
HoloLens Connect Four Development Underway

HoloLens Connect Four development is underway. It's just a learning project for me before moving on to larger projects (I'm in the field of higher ed), since I assume there would be legal ramifications with Hasbro or Milton Bradley or whoever owns the rights (unless the game is in the public domain? Not sure).

The thought is that the board would be too small and far away (given the 0.85-meter limitation) to play on a table, so the game will sit on the floor of your room. The board is 1.7 meters wide and about 2 meters tall. I might need to scale that down, but I thought I'd go big for now; I'll need my hardware (I'm in wave 2) to give it a full test.

I've implemented the gameplay mechanics. Chip position is set with the keyboard, and "resetting" the game board (letting the chips fall) is also handled by a key press. The intention is to use gaze input for the user to move the chip into position, use the tap gesture to make it fall into place, and use a voice command, "Reset board," to cause all the chips to fall out from the bottom and scatter all over your living room, where they will stay for several seconds before disappearing.
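The actual project is in Unity, and none of its code appears in this post, but the chip-drop and reset mechanics described above can be sketched language-agnostically. This is a minimal Python sketch, assuming a standard 7-column, 6-row Connect Four grid; all names and the board representation are my assumptions, not taken from the project.

```python
ROWS, COLS = 6, 7  # standard Connect Four dimensions (assumption; the post doesn't give a grid size)
RED, BLACK = "red", "black"

def new_board():
    """Board as a list of columns; chips stack from index 0 (the bottom slot)."""
    return [[] for _ in range(COLS)]

def drop_chip(board, col, chip):
    """Drop a chip into a column, as the tap gesture would.

    Returns the row the chip lands in, or None if the column is full.
    """
    if len(board[col]) >= ROWS:
        return None
    board[col].append(chip)
    return len(board[col]) - 1

def reset_board(board):
    """'Reset board' voice command: empty every column.

    The scatter-and-disappear effect is purely a rendering concern,
    so the model just clears its state.
    """
    for column in board:
        column.clear()
```

In the Unity version, `drop_chip` would be triggered by the tap gesture and `reset_board` by the keyword recognizer; the physics of the falling chips would live in the scene, not in the game model.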

I'm waiting on Unity 5.4 (and, of course, the emulator in Visual Studio) to code gaze, gestures, and voice commands, and to do the initial placement of the game board in the spatial map.

No AI yet. The human user currently plays both red and black chips, which alternate between turns.

Below is a screenshot of the game board in play and some early code.

Answers

  • Great to see this! Keep us updated.

    It'll be fascinating to see popular boardgames/table games recreated in Mixed Reality.

    Battleships anyone?

  • Battleship would be perfect for AR, especially with some explosions and animations!

    It would be an interesting multi-player implementation where you could actually have both players looking at the same board, just with different ship configurations that they've placed in the set up stage. Or you could have them sit across the room from each other with the boards in the middle.

It's up and running in the emulator now. I've implemented gaze to place the player's chip and a tap gesture to drop it. The computer player plays the black chips, but it currently just chooses a random non-full column. Win conditions are programmed, and a message appears when either player gets four in a row horizontally, vertically, or diagonally.

    I still need to implement:
      • Voice commands for reset
      • Initial game board placement
      • AI
      • Resizing the game board
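The win check and placeholder computer player described in the answer above can be sketched in a few lines. This is a hedged Python sketch, not the project's Unity code; the grid representation (row 0 at the bottom, `None` for empty cells) and all names are my assumptions.

```python
import random

ROWS, COLS = 6, 7  # standard Connect Four dimensions (assumption)

def winner(grid, chip):
    """True if `chip` has four in a row horizontally, vertically, or diagonally.

    `grid[r][c]` holds a chip id or None; row 0 is the bottom row.
    From each cell, scan four steps in each of four directions
    (right, up, up-right, up-left); the mirrored directions are
    covered by starting from the other end of the line.
    """
    for r in range(ROWS):
        for c in range(COLS):
            for dr, dc in ((0, 1), (1, 0), (1, 1), (1, -1)):
                cells = [(r + i * dr, c + i * dc) for i in range(4)]
                if all(0 <= rr < ROWS and 0 <= cc < COLS and grid[rr][cc] == chip
                       for rr, cc in cells):
                    return True
    return False

def random_move(grid):
    """The current 'AI': a uniformly random non-full column, or None if the board is full."""
    open_cols = [c for c in range(COLS) if grid[ROWS - 1][c] is None]
    return random.choice(open_cols) if open_cols else None
```

A real AI (the remaining to-do item) could replace `random_move` with something like minimax over this same grid without touching the win check.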
