http://www.eurogamer.net/articles/2...one-cloud-message-shifts-to-dedicated-servers
There was not one mention of "the cloud" during Microsoft's 90 minute presentation, either from Xbox boss Phil Spencer or the many developers that took to the stage to talk about their games.
Instead, we heard the term "dedicated servers" over and over again. This after Microsoft spent a great deal of time and effort insisting the power of the cloud would revolutionise gaming as we know it.
And that - finally - is how Microsoft is now referring to the cloud. Is it an official rebrand?
"You picked up on exactly that," Phil Harrison told me at E3 last week when I mentioned Microsoft failed to mention "the cloud" once during its press conference last week.
"Xbox Live is the service. Dedicated servers is the benefit. That is the reason why these games are going to be better, why the experience for multiplayer is going to be better.
Xbox boss Phil Spencer confirmed on Twitter that the video, below, featured early Crackdown work. It shows how the cloud can make the destruction of a building, for example, faster and smoother. The fancy Crackdown trailer shown off at E3 last week also featured a building being destroyed.
We've only seen a hint of what's possible so far beyond multiplayer gaming with Drivatars and Titanfall's grunt AI. The Crackdown prototype is a great example of what the cloud should excel at: offloading complex calculations away from the host console, where the additional 100ms or so of latency to and from the datacentre won't unduly impact gameplay. The cloud doesn't address graphics bottlenecks, but the prototype demonstrates how much strain simulating the destruction of a complex scene can place on the CPU - an area where both PS4 and Xbox One lag behind even mid-range PC processors.
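To make that concrete, here is a minimal sketch of the fire-and-forget offload pattern the prototype appears to rely on. Everything below is invented for illustration (the function names, the sleeps standing in for the ~100ms round trip); it is not Microsoft's actual code or API. The console dispatches the expensive job, keeps rendering a cheap local approximation, and swaps in the detailed result whenever it arrives:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for a datacentre round trip plus server-side
# physics; the 100ms figure and everything else here is an assumption.
def remote_destruction_sim(building_id):
    time.sleep(0.10)  # assumed network round trip
    time.sleep(0.05)  # assumed server-side compute time
    return f"detailed debris field for building {building_id}"

def local_placeholder(building_id):
    # Cheap approximation the console can compute within one frame.
    return f"coarse rubble for building {building_id}"

executor = ThreadPoolExecutor(max_workers=4)

def game_loop():
    # Dispatch the expensive job, then keep rendering without blocking.
    pending = executor.submit(remote_destruction_sim, 7)
    state = local_placeholder(7)
    for frame in range(20):            # ~0.33s worth of 60fps frames
        if pending.done():
            state = pending.result()   # swap in the detailed result
        print(f"frame {frame:02d}: rendering {state}")
        time.sleep(1 / 60)             # simulated frame budget

game_loop()
```

The latency is tolerable precisely because nothing gameplay-critical blocks on the reply; a calculation that had to resolve within a single 16ms frame could not be offloaded this way.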
"The question is really how much CPU time Microsoft is willing to dedicate to each game instance. I suspect that the Crackdown prototype uses an order of magnitude more CPU power than, say, the grunt AI in Titanfall. It'll be an interesting stress test of the Azure infrastructure to see if it can hold its own in a game likely to break the million-sales barrier in short order."
Video of the demo is available in the article.
I remain unconvinced on the crucial point: how are the economics of that infrastructure going to work? I wrote about that in an earlier thread:
No, and the technological reasons have been discussed at length, but even apart from them, it just doesn't make sense.
There is no reason to build a "hybrid" application for performance reasons, that is, a game that runs partly on local hardware and partly on remote infrastructure. You would end up with a distributed software system, which is inherently more difficult to engineer and test, and you would combine expensive local hardware with expensive remote infrastructure. It doesn't make sense.
The only reasonable design for a cloud-based gaming infrastructure that scales its performance over time is to run games entirely in the cloud and stream their input/output to a generic thin client. This way the game itself is not architecturally distributed and is therefore much easier to develop, while you still get the cloud's advantages, such as resource efficiency and shifting upgrades and maintenance from customers to the service provider.
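To illustrate the contrast, a thin client in that design reduces to a trivial loop: sample input, send it upstream, display whatever frame comes back. A minimal sketch, with the server address, protocol, and all functions invented as stand-ins:

```python
import time

# Invented stand-ins for a streaming service; no real protocol,
# address, or API is implied by any name below.
def read_controller():
    return {"stick_x": 0.0, "buttons": 0}  # pretend input sample

def send_input(server, packet):
    pass                                   # would go over the wire

def receive_frame(server):
    time.sleep(1 / 60)                     # pretend network + decode time
    return b"\x00" * 16                    # pretend decoded video frame

def display(frame):
    pass                                   # would blit to the screen

def thin_client(server="game.stream.example"):
    # The entire game runs remotely; the client simulates nothing.
    for _ in range(60):                    # one second of frames
        send_input(server, read_controller())
        display(receive_frame(server))

thin_client()
```

All simulation and rendering stay server-side, which is why such a system can improve its performance over time without ever touching the client.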
There are, of course, features that you need a server infrastructure for, but those features inherently require networking: synchronization of players in multiplayer games, social features, aggregation and analysis of data, and so on. Nobody is "offloading" work that would otherwise run locally for performance reasons. Performance may benefit as a side effect (e.g., a console does not have to host the server in multiplayer games), but the functionality itself would be inherently network-based.
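The distinction is visible in code. A toy authoritative server (names and structure invented for illustration) exists to reconcile state from machines that are physically apart; it is not lending the console spare cycles:

```python
# Toy authoritative multiplayer server; names and structure are
# invented for illustration. Its job is reconciling state from
# machines that are physically apart, not lending consoles CPU time.
world = {}  # player_id -> (x, y) position

def on_client_packet(player_id, move):
    # Merge one player's input into the shared world state and
    # return the snapshot that would be broadcast to every client.
    x, y = world.get(player_id, (0, 0))
    world[player_id] = (x + move[0], y + move[1])
    return dict(world)

# Two players on two different consoles: neither machine could
# compute the other's state locally, so nothing is being "offloaded".
print(on_client_packet("p1", (1, 0)))  # {'p1': (1, 0)}
print(on_client_packet("p2", (0, 2)))  # {'p1': (1, 0), 'p2': (0, 2)}
```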
This is true not just for technological reasons but also for economic ones: why would anyone pay for a cloud infrastructure to implement features that no player cares about enough to justify the price? Nobody would pay a monthly fee for cloud servers that calculate pre-baked lighting. That is ridiculous. The developer would simply scale such features down so that they run on the local hardware. It's easier, and it's free.
And finally, I haven't heard a single convincing idea for performance-motivated offloading of subsystems to a server. I have only seen developers philosophizing about what might be technically possible, without taking into account the necessary development costs or the actual impact of the idea. Everything else has been done long ago in online games.
Offload me to the Cloud if old.