The Most Technical Issue of VR
With the release of devices like the HTC Vive and the Oculus Rift, the previously exclusive world of virtual reality has become available to the wider public. Game developers worldwide have started delving into these new worlds, often running into issues that have already been investigated in 40 years of virtual reality research. Most of these issues relate to how the user perceives the virtual environment. This talk showcases the vast possibilities and challenges of virtual reality and gives participants a glimpse of previous research, to seed new ideas and to ease their first steps in this field.
Towards More Adaptive and Creative Artificial Intelligence in Games
In this talk I will review recent advances in neuroevolution (i.e. evolving artificial neural networks through evolutionary algorithms) and procedural content generation that are especially relevant to creative domains like games. Evolutionary computation offers unique affordances for game design that enable entirely new types of games. Examples of such games we developed include Petalz, in which users can evolve an unlimited number of artificial flowers and then transfer them to the real world via 3D printing, and Artefacts, which is similar to Minecraft but allows players to evolve arbitrary 3D building blocks together with others. Given their explorative nature, evolutionary algorithms also enable new ways for humans and machines to collaborate, for example to easily create NPC behaviours. I will show how casual users can create NPC controllers for Super Mario without any domain or programming knowledge.
Applications of Eye Tracking in Virtual Reality
Eye tracking is a hot topic in VR, but what is all the fuss about? Tom will present how eye tracking can be used to personalize the 3D experience and how the point of gaze revolutionizes interaction with a virtual world. The power of eye tracking is not limited to interaction alone: in the long run, eye tracking can make logins obsolete and, via foveated rendering, make high-resolution displays feasible in HMDs. Social interaction is considered one of the killer applications and critical for mass adoption of virtual reality, and a key component for that is a natural representation of eye movements in avatars. Tom will outline how this can be done and present both the state of the art and a leap into the future of eye tracking in VR.
The Disruptive Potential of Applied Interactive Technologies (APITs)
Applying technologies from the games and consumer electronics sector to other industries and use cases holds huge potential. The talk will highlight this potential by describing experience from projects of the Creative Media R&D Group (@HTW Berlin) and by showcasing developed prototypes. These prototypes apply, for example, virtual and augmented reality, game engines, and wearables to themes such as co-creational exhibition design, industrial plant planning and maintenance, and facility/building management and automation. The talk concludes with an overview of networking and funding opportunities that support enterprises wanting to explore APITs in diverse contexts.
The Future of Virtual Reality Video
Besides gaming, virtual reality video is currently one of the most researched topics in the virtual reality community. There is a wide variety of possible formats, ranging from standard monoscopic 360° equirectangular video to future volumetric approaches, e.g. using depth maps or light fields. This talk will give a short introduction to the current state, from production to playout, and its limitations. Furthermore, the talk will go into detail on advanced methods that improve on the current state, such as foveated streaming, spatial audio, and volumetric video.
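To make the standard format mentioned above concrete, here is a minimal sketch of the equirectangular mapping: a unit view direction is turned into a (u, v) coordinate in the 360° frame. Axis conventions differ between players and engines, so the one below (z forward, y up, v growing downward) is an assumption for illustration, and the function name is hypothetical.

```python
import math

def dir_to_equirect(x, y, z):
    """Map a unit view direction to (u, v) texture coordinates in [0, 1].

    Assumed convention (one of several in use): +z is the forward axis,
    +y is up; longitude sweeps left/right, latitude up/down.
    """
    lon = math.atan2(x, z)                    # longitude in -pi .. pi
    lat = math.asin(max(-1.0, min(1.0, y)))   # latitude in -pi/2 .. pi/2
    u = lon / (2.0 * math.pi) + 0.5           # wrap longitude to 0 .. 1
    v = 0.5 - lat / math.pi                   # v grows downward, as in image space
    return u, v
```

Looking straight ahead (0, 0, 1) lands in the center of the frame at (0.5, 0.5), and looking straight up maps to the top edge of the image; the strong stretching near the poles is exactly the redundancy that foveated streaming and volumetric formats try to avoid.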
Drone Swarm - The Tech Behind 32,000 Simulated Space Drones
Drone Swarm is a sci-fi RTT (real-time tactics) game set in space, in which you control a swarm of 32,000 drones at once! Every single drone is simulated and rendered individually in 3D space and consists of several polygons.
In this talk I'll show how we managed to get this to run in real-time. Drone Swarm is made in Unity, but the general principles involved apply to all game engines.
On the way to making this core tech work, we faced several major challenges. I'll talk about these challenges and how we tackled them.
To sum up the talk, I'll show a brief video of the resulting swarm dynamics in our current game prototype.
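To give a feel for the kind of core tech involved, here is a deliberately simplified, hypothetical sketch of a batched swarm update. Drone Swarm itself is built in Unity, so this is not its actual code; it only illustrates the general principle that makes tens of thousands of agents feasible: replacing per-drone loops with whole-array operations. The single "pull toward the centroid" rule stands in for real flocking rules, which would need neighbour queries (e.g. a spatial hash).

```python
import numpy as np

def swarm_step(pos, vel, dt=0.02, cohesion=0.5, damping=0.98):
    """One vectorized update for all N drones at once.

    Simplified sketch: every drone is pulled toward the swarm centroid
    (cohesion) and its velocity is damped. The key idea is that each
    rule is one array operation over all drones, not a per-drone loop.
    """
    centroid = pos.mean(axis=0)                       # (3,) swarm center
    vel = damping * vel + cohesion * (centroid - pos) * dt
    pos = pos + vel * dt
    return pos, vel

# Simulate the full 32,000-drone swarm for one step.
rng = np.random.default_rng(42)
positions = rng.uniform(-100.0, 100.0, size=(32_000, 3)).astype(np.float32)
velocities = np.zeros_like(positions)
positions, velocities = swarm_step(positions, velocities)
```

The same batching idea carries over to any engine: on the CPU via SIMD-friendly data layouts, or on the GPU via compute shaders, which is the usual route for counts this high.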
Version Management for Game Developers: Practical Considerations and Codeline Management
Version control is an essential tool for any kind of software development, and increasingly for hardware development as well. But video game development differs from most other software projects. The majority of files involved are binary files that cannot be merged, and these files tend to be very large. Many game development projects grow to terabytes, and the upcoming 4K games for consoles and PC will quadruple the amount of assets again. On the other hand, technical leads and QA teams tend to demand a single repository for all files that go into a game: a Single Source of Truth. A Single Source of Truth avoids the pitfalls of mismatched check-ins from different silos of data, and such a repository also simplifies the inevitable search for the one check-in that broke the latest build. Ideally, such a repository should also be distributed, so that multiple development sites can work on the same project if required; according to large game developers like EA, it is no longer possible to find all the required talent in a single place. This presentation introduces some basic concepts of version management, some specific requirements of game development, and techniques from real-world development showing how these requirements are met.
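The "search for the one check-in that broke the latest build" mentioned above is, at its core, a binary search over a linear history; this is the idea that tools such as `git bisect` automate. A minimal sketch, with a hypothetical `builds_ok` predicate standing in for an actual build-and-test step:

```python
def find_breaking_commit(commits, builds_ok):
    """Binary-search a linear commit history for the first bad check-in.

    `commits` is ordered oldest to newest; `builds_ok(c)` returns True if
    the build at commit c succeeds. Assumes the classic bisect invariant:
    the oldest commit is good, and once a commit breaks the build, every
    later commit is broken too.
    """
    lo, hi = 0, len(commits) - 1      # lo is known good, hi is known bad
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if builds_ok(commits[mid]):
            lo = mid                  # still good: the break is later
        else:
            hi = mid                  # already broken: the break is here or earlier
    return commits[hi]
```

With a Single Source of Truth, one such search covers code and assets alike; with per-discipline silos, a broken build may stem from a mismatched pair of check-ins that no single-repository search can pinpoint, which is exactly the pitfall the abstract describes.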
Rapid UI Creation using Data Binding in Unity
Modern application development architectures often use user interface patterns like Model-View-Controller and Model-View-ViewModel, which provide a clear structure for both developer and artist workflows, along with direct support in the IDE. However, the current status quo in mobile game development relies on engines like Unity3D that have no native support for such workflows. The result is a more error-prone and redundant user interface development workflow, in which artists create a user interface outside the engine and developers have to mimic it inside it. This talk shows how to apply modern development architectures to Unity3D, allowing developers and artists each to focus on their own expertise while the results still align in harmony. By applying these proven concepts, each party can work independently while still being able to test their own work at all times.
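At the heart of the data-binding workflow described above is an observable value: the ViewModel exposes state, and bound views are notified of changes instead of being updated by hand. A minimal sketch of that idea (Unity itself uses C#, and the class and member names here are hypothetical, not any particular framework's API):

```python
class Observable:
    """Minimal observable value: the core mechanism behind data binding.

    A sketch only; a real MVVM binding layer for an engine would add
    two-way binding, change batching, and editor/designer tooling.
    """
    def __init__(self, value):
        self._value = value
        self._listeners = []

    def bind(self, listener):
        self._listeners.append(listener)
        listener(self._value)          # push current state immediately

    @property
    def value(self):
        return self._value

    @value.setter
    def value(self, new_value):
        if new_value != self._value:
            self._value = new_value
            for listener in self._listeners:
                listener(new_value)

class ScoreViewModel:
    """ViewModel holding UI-facing state, with no knowledge of the view."""
    def __init__(self):
        self.score = Observable(0)

# The "view" is just a callback; in an engine it would update a UI label.
vm = ScoreViewModel()
shown = []
vm.score.bind(shown.append)
vm.score.value = 100                   # the bound view updates automatically
```

Because the view only subscribes to the ViewModel, an artist can restyle or replace the view and a developer can change game logic independently, and each side can still be tested in isolation, which is exactly the separation the talk advocates.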