Building a better Pokemon GO!

Felcjo Ringo
5 min read · Apr 3, 2019

Gotta truly interact with ’em all — with AR!

Yes, I Really Wanted to Interact with my Pokemon

Pokemon GO was the AR phenomenon that took the world by storm in July 2016. Who can forget the huge groups of people running through public parks in an attempt to Catch ’em All? The app was so popular that, at its peak, it had over 20 million daily active users.

While no one can doubt that the novel interactive experience was fun, one thing bothered many techies. The app was hailed as the Augmented Reality killer app, yet it did not have any of the features that people in the field usually refer to as AR. Sure, it had geolocation features to plant Pokemon and Pokestops at certain GPS coordinates, but it did not incorporate any real-world tracking or ‘space-carving’ features such as ground plane detection or SLAM (Simultaneous Localization and Mapping). When a player entered a battle with a Pokemon, the game showed an animated Pokemon against a parallaxed background. That is, the Pokemon did not appear to walk around the game world; they simply appeared within a still camera image of the world taken at the start of the battle.

The still-image effect is a big derp for me.

What is interesting is that Pokemon GO developer Niantic reused the geolocation framework from its previous game, Ingress, in which players roam around real-world locations and go on missions that involve interacting with the world through the game’s overlay. Nintendo noticed the game and later approached Niantic to build a new game around the Pokemon IP, loosely based on the Ingress concept.

When I first heard of Pokemon GO, I was hoping to get the experience of interacting with Pokemon roaming around our world… and then, of course, capturing them for my sick amusement of catching ‘em all. I wanted to move my camera around and have Mr. Pikachu smile at me and wave, and see Charizard’s fiery breath lash out at me in real combat. Now that I’ve had time to process my disappointment, I figured I would try to build a true Pokemon experience in true AR.

The true inspiration for this post came after I came across an amazing post on Medium by Jie Feng, entitled Creating the True Augmented Reality Pokemon Go. Jie, writing shortly after the game’s release, used Google’s Project Tango (now defunct) to create a map of the room using the aforementioned SLAM technique. With the map, he then found 3D models of Pikachu, the most-beloved Pokemon character, rigged and animated them using mixamo.com, and imported all the assets into a Unity game running on the Project Tango hardware. It worked quite well!

Making the Better Game

The year is 2019, and AR solutions are cheaper and more accessible than ever before. In mobile app development, there are currently three bellwether platforms that developers love to use: Apple’s ARKit, Google’s ARCore, and the cross-platform Vuforia Engine. Having previously developed with the Vuforia Engine, I decided to go with a Unity/Vuforia combo for this app.

The first thing to do in a project such as this is to collect our assets: the things in the game that we wish to interact with. For this project, all we really need is a Pokemon (Pikachu) and a Pokeball. We’ll create .fbx animations for Pikachu with mixamo.com. Mixamo is an online tool that lets you upload any 3D humanoid model and have its skeleton auto-rigged. From there, it has a database of hundreds of animations that work on the joints of any model. Stay tuned for my next post to see me animate a 3D model of myself dancing salsa!

Example of Pikachu animated in Mixamo

From here, we will import the assets into Unity and set up idle and walk transitions for Pikachu. We’ll need code for moving Pikachu around the AR game world, code for how the player (you) interacts with the Pokeball, and code for how the Pokeball interacts with the Pokemon. Each of these code snippets is an interesting creative task in and of itself, and could be approached in several different ways.

For moving Pikachu in the game world, it’s best to keep it simple: I chose two seconds of idling followed by two seconds of walking in a random direction.
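
Here’s a minimal sketch of how that wander behaviour could look as a Unity script. The Animator parameter name “IsWalking”, the speed, and the interval are my own placeholder choices, not something from the original project.

```csharp
using UnityEngine;

// Alternates 2 seconds of idling with 2 seconds of walking in a random
// direction. Assumes an Animator Controller with a bool "IsWalking"
// parameter driving the idle/walk transitions.
public class PikachuWander : MonoBehaviour
{
    public float moveSpeed = 0.3f;      // metres per second
    public float intervalSeconds = 2f;  // time spent idling, then walking

    private Animator animator;
    private Vector3 walkDirection;
    private float timer;
    private bool walking;

    void Start()
    {
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        timer += Time.deltaTime;
        if (timer >= intervalSeconds)
        {
            timer = 0f;
            walking = !walking;
            if (walking)
            {
                // Pick a random heading on the ground plane and face it.
                float angle = Random.Range(0f, 360f);
                walkDirection = Quaternion.Euler(0f, angle, 0f) * Vector3.forward;
                transform.rotation = Quaternion.LookRotation(walkDirection);
            }
            animator.SetBool("IsWalking", walking);
        }

        if (walking)
        {
            transform.position += walkDirection * moveSpeed * Time.deltaTime;
        }
    }
}
```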

How should the Pokeball move in relation to the player? It would be easiest to simply keep the Pokeball a few inches in front of the player at all times; when the player wants to throw it, they simply tap the screen.
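
One way to sketch that in Unity: keep the Pokeball parented to the AR camera’s view until a tap, then hand it over to physics. The hold distance and throw force here are made-up values to tune by hand, and the camera reference is whatever camera your AR session uses.

```csharp
using UnityEngine;

// Keeps the Pokeball floating a short distance in front of the AR camera,
// then flings it forward with physics when the screen is tapped.
public class PokeballController : MonoBehaviour
{
    public Camera arCamera;           // the camera rendering the AR scene
    public float holdDistance = 0.3f; // roughly a few inches in front of the player
    public float throwForce = 4f;

    private Rigidbody rb;
    private bool thrown;

    void Start()
    {
        rb = GetComponent<Rigidbody>();
        rb.isKinematic = true;        // no physics while the ball is held
    }

    void Update()
    {
        if (thrown) return;

        // Follow the camera so the ball always sits in front of the player.
        transform.position = arCamera.transform.position
                           + arCamera.transform.forward * holdDistance;

        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            thrown = true;
            rb.isKinematic = false;
            rb.AddForce(arCamera.transform.forward * throwForce, ForceMode.Impulse);
        }
    }
}
```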

Let’s think about how we want the Pokemon-catching process to work. In the games and the anime, the player throws a Pokeball, it bounces off of the Pokemon, and it may or may not (depending on how strong and/or obstinate the Pokemon is) catch the Pokemon. To achieve the same effect, we will need two concentric sphere colliders around Pikachu’s head, with only the larger one acting as a trigger. The way this works: the player throws the ball, the ball hits the Pokemon at the smaller sphere collider, the ball bounces away normal to the collider, and when the ball exits the larger collider it triggers a function that checks whether the Pokemon should be caught.
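
A minimal sketch of that trigger function is below. It assumes this script lives on the larger sphere collider (marked “Is Trigger”), that the Pokeball is tagged “Pokeball”, and that a flat catch probability stands in for whatever strength/obstinance formula you prefer.

```csharp
using UnityEngine;

// Catch check on the larger, trigger-only sphere collider. The smaller,
// non-trigger collider handles the physical bounce; when the ball leaves
// the outer sphere afterwards, we roll to see if the Pokemon is caught.
public class CatchZone : MonoBehaviour
{
    [Range(0f, 1f)]
    public float catchChance = 0.5f;   // stronger Pokemon => lower value
    public GameObject pokemon;         // the Pikachu model to hide on capture

    void OnTriggerExit(Collider other)
    {
        if (!other.CompareTag("Pokeball")) return;

        if (Random.value < catchChance)
        {
            Debug.Log("Gotcha! Pikachu was caught!");
            pokemon.SetActive(false);
        }
        else
        {
            Debug.Log("Oh no! Pikachu broke free!");
        }
    }
}
```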

The finished product is shown below — I hope you enjoy!
