There is plenty of designer-friendly projection mapping software, such as HeavyM and MadMapper, but in practice I have found it is not always the most suitable. For instance, a work may only need a short film projected from one fixed angle, but run for 4 days at 7 hours per day; I can't leave my laptop there for such a simple case for such a long time. MadMapper does offer a solution named MiniMad, a media box that plays MadMapper files and could handle this case, but I don't think it is worth it.

Let's get to the point: After Effects can produce a result that is both simple and high-quality in such cases.

In this tutorial, I used Adobe After Effects CC 2017, a mini HD media box, and a BenQ TH682ST projector.

  1. Check the outputs of the media box and the projector, and make sure the media box is set to the same resolution as the projector. In this case, my projector is Full HD 1920×1080, so I tried setting my media box to 1080i 50Hz, 1080i 60Hz, and 1080p 50Hz, and finally used 1080p 60Hz, simply because it was the one that worked on the projector.
  2. Since I know my projector can play a 1920×1080 video, I create a 1920×1080 composition in After Effects and put everything I want to show in the projection mapping inside it.
  3. Last but most importantly, in After Effects, use View > New Viewer to create a second preview window and drag it onto the projector's screen.
  4. Use the shortcut Command+\ to make that viewer full screen.
  5. Now it is ready for the projection mapping. In the viewer on the laptop's screen, I only use the effect Distort > Corner Pin to move every element into the right position.
  6. After animating, export the animation by clicking Composition > Add to Render Queue. (A good video format that plays back easily at good quality while staying small, so it plays smoothly, is QuickTime with the H.264 codec.)
  7. Finally, play your exported video full screen on the second screen (the projector) using a video player such as QuickTime Player or VLC.


There is an ARKit demonstration named Portal, created by Japanese developer Kei Wakizuka. When you touch the screen, a portal appears in the real space on screen, with correct perspective and a space-distortion effect. Through the portal lies a digital world, and an immersive one: the audience can step into it, and the real world then sits behind the portal.

There is now a variety of similar projects, and most of them are named Dokodemo Door after the magical tool in the Japanese manga Doraemon, which lets Doraemon and his friends go anywhere.

Most Dokodemo Door projects are based on SLAM technology such as ARKit. The reason is that with image-recognition-based AR it is not so easy to step into the digital world: the audience has to keep the device aimed at the target image, which is a terrible experience. Still, I will experiment with both technologies; there must be more than one possible form for a Dokodemo Door.

My experiments are based on Vuforia and ARKit. Here are the two results, where you can see the difference between the two design logics:

[GIF: the two results]

Creating the negative space is the basic step of this kind of AR experience. So far I have found only one method, using a DepthMask shader. The idea is to add “a mask object using the Depth Mask shader. This object will be drawn just after regular opaque objects, and will prevent subsequent objects from being drawn behind it.”

1. Based on Vuforia

1. Components

  1. A target image, which is only necessary for image-recognition-based AR; it should be placed a little lower than the hole;
  2. A closed 3D object with a hole (I created mine in C4D), used as the entrance for the audience; in my example it is a cylinder. DepthMask.shader should be applied to this closed shape to make it an invisibility cloak that hides the components inside;
  3. An inner box without a top side. This will be the most important part, and may deserve a better construction method if you want to realize a full virtual world, though that is not necessary in my example;
  4. A ball with ShowInside.shader applied.

2. Shaders

1. DepthMask.shader

 Shader "Custom/DepthMask" {
  
     SubShader {
         // Render the mask after regular geometry, but before masked geometry and
         // transparent things.
  
         Tags {"Queue" = "Geometry-10" }
  
         // Don't draw in the RGBA channels; just the depth buffer
  
         ColorMask 0
         ZWrite On
  
         // Do nothing specific in the pass:
  
         Pass {}
     }
 }

2. ShowInside.shader. Here is the documentation from Unity about how to cull front or back faces (by default, back faces are culled, so you cannot see the inside of a 3D object).

Shader "Custom/ShowInside" {
    Properties {
        _Color ("Main Color", Color) = (1,1,1,1)
        _MainTex ("Base (RGB)", 2D) = "white" {}
    }

    SubShader {
        Tags { "RenderType" = "Opaque" }
        LOD 100

        Pass {
            // Show the inner faces of the mesh instead of the outer ones
            Cull Front
            Lighting Off
            // First stage outputs the texture; second stage tints it with _Color
            SetTexture [_MainTex] { combine texture }
            SetTexture [_MainTex] {
                ConstantColor [_Color]
                Combine Previous * Constant
            }
        }
    }
}

2. Based on ARKit

Components

  1. Two closed objects, with holes facing the same direction and with different sizes. The outside one should be a little larger than the inside one, since its function is to hide the inside one via DepthMask.shader, while the skybox material should be applied to the inside one to create a scene with depth (a minimal sketch of such a shader follows this list);
  2. A doorframe for the hole, to prevent the edge from looking too sharp. It can be a realistic doorframe with a high-quality texture or a black rim with a dynamic shader;
  3. A scene with several objects to create a sense of spatial hierarchy.
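Since the inner object wears the skybox, here is a minimal sketch of how such a material could be written: an unlit pass that culls front faces, so the inside of the closed object is visible, and samples a cubemap along the object-space direction. The shader name Custom/InnerSkybox and the _Cube property are my own assumptions, not part of the project files above.

Shader "Custom/InnerSkybox" {
    Properties {
        // Hypothetical cubemap slot: assign any skybox-style cubemap here
        _Cube ("Environment Cubemap", Cube) = "" {}
    }
    SubShader {
        Tags { "RenderType" = "Opaque" "Queue" = "Geometry" }

        Pass {
            // Draw the inner faces of the closed object
            Cull Front

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            samplerCUBE _Cube;

            struct v2f {
                float4 pos : SV_POSITION;
                float3 dir : TEXCOORD0;
            };

            v2f vert (float4 vertex : POSITION) {
                v2f o;
                o.pos = UnityObjectToClipPos(vertex);
                // Use the object-space position as the lookup direction,
                // so the cubemap appears painted onto the inner surface
                o.dir = vertex.xyz;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target {
                return texCUBE(_Cube, normalize(i.dir));
            }
            ENDCG
        }
    }
}

Applied to the inner object, the world becomes visible only through the hole, because the outer DepthMask object blocks it everywhere else.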

There are great examples by EYEJACK of mixing animation and graphics in AR.

EYEJACK has also released an AR book that includes these examples.

I feel the result is not yet perfect at this stage, because of the limits of current devices and algorithms, but it will be a pleasant way to use the AR medium in the future, with HMDs such as Google Glass.

So the problem now is how to achieve this effect with present technology, in a way easy enough that designers with a basic coding background can do it.

At first, I had two ideas:

  1. Using a shader to make the background invisible;
  2. Using a video with an alpha channel.

The first way works well, and I learnt a lot from this tutorial, but the difficulty of positioning the video exactly where I want it makes me think it might not be a good method. Here is the result:

[GIF: result of the shader-based method]

1. Software

  1. macOS 10.13.2;
  2. Unity 2017.3;
  3. Vuforia SDK included in Unity’s installation package;
4. After Effects CC 2017.

2. Video

  1. In AE, use Effect > Keying > Color Range to make the background invisible, and export two videos: one with the RGB channels and one with the alpha channel.
  2. Create a new composition, place the two videos side by side, and render it as an MP4.

3. AR

  1. If you don't know how to set up Vuforia's ImageTarget environment, please follow the official Vuforia tutorial; it is quite basic and I won't copy it here.
  2. Create a Plane in the Hierarchy; it will later be used as the surface the video plays on. Drag it under the ImageTarget.
  3. Create a new Material in the Project panel and drag it onto the plane; then use Add Component > Video Player on the plane, import your video into the Project, and drag it into the Video Clip slot of the Video Player.
  4. Create a new Shader in the Project panel, and replace all the default code with the following, which I learnt from the tutorial mentioned above:
    Shader "Custom/Transparent" {
        Properties {
            _Color ("Color", Color) = (1,1,1,1)
            _MainTex ("Albedo (RGB)", 2D) = "white" {}
            _Glossiness ("Smoothness", Range(0,1)) = 0.5
            _Metallic ("Metallic", Range(0,1)) = 0.0
            _Num ("Num", float) = 0.5
        }

        SubShader {
            Tags { "Queue"="Transparent" "RenderType"="Transparent" }
            LOD 200

            CGPROGRAM
            #pragma surface surf NoLighting alpha:auto

            // Custom lighting model that passes the colour through unlit
            fixed4 LightingNoLighting (SurfaceOutput s, fixed3 lightDir, fixed atten)
            {
                fixed4 c;
                c.rgb = s.Albedo;
                c.a = s.Alpha;
                return c;
            }

            float _Num;
            sampler2D _MainTex;

            struct Input {
                float2 uv_MainTex;
            };

            half _Glossiness;
            half _Metallic;
            fixed4 _Color;

            UNITY_INSTANCING_BUFFER_START(Props)
            UNITY_INSTANCING_BUFFER_END(Props)

            void surf (Input IN, inout SurfaceOutput o)
            {
                // The left half of the video holds the RGB image
                o.Emission = tex2D(_MainTex, IN.uv_MainTex).rgb;

                if (IN.uv_MainTex.x >= 0.5)
                {
                    // The right half holds the alpha matte; hide it completely
                    o.Alpha = 0;
                }
                else
                {
                    // Read the matching pixel of the right half as this pixel's alpha
                    o.Alpha = tex2D(_MainTex, float2(IN.uv_MainTex.x + 0.5, IN.uv_MainTex.y)).r;
                }
            }
            ENDCG
        }
        FallBack "Diffuse"
    }

5. Select the plane and change the shader of its material to this new Custom/Transparent shader.

4. Test & Build

  1. If you have finished the official Vuforia tutorial, you should know how to preview the scene with your webcam;
  2. If the webcam preview works well, you can build to your mobile device with File > Build and Run;
  3. This automatically launches Xcode; you can learn about this process from the official Unity tutorial.
  4. Done

The second way is much easier, but it has the drawback that it only works on Apple devices, because it uses Apple ProRes 4444, which is only supported on macOS; check Unity's official documentation on video transparency support for more information.

1. Video

  1. The workflow is the same as in the first method: in After Effects you end up with a video that has a transparent background;
  2. Then export the video in QuickTime format with RGB + Alpha channels;
  3. In the same window there is a button named Format Options; click it, and in the window that opens choose Apple ProRes 4444 as the video codec;
  4. Render it and you will have a Unity-ready transparent video.

2. AR

  1. Import it into Unity, and in the Inspector tick Keep Alpha and click Apply;
  2. Drag it into the Video Clip slot of the plane's Video Player component, as in the first method, and create a new Material for the plane with its shader set to Transparent/VertexLit with Z (a minimal sketch of what that shader does follows below);
  3. Done.
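For reference, here is a minimal sketch of what a shader like Transparent/VertexLit with Z does, written as a simple unlit pass; the name Custom/UnlitVideoAlpha is my own, not a Unity built-in. The two key points are the SrcAlpha blend mode, which uses the alpha channel carried by the ProRes 4444 video, and ZWrite On (the "with Z" part), which makes the plane write to the depth buffer so it occludes objects behind it correctly.

Shader "Custom/UnlitVideoAlpha" {
    Properties {
        _MainTex ("Video Texture (RGBA)", 2D) = "white" {}
    }
    SubShader {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }

        Pass {
            // Standard alpha blending driven by the video's alpha channel
            Blend SrcAlpha OneMinusSrcAlpha
            // "with Z": write depth so the plane occludes objects behind it
            ZWrite On
            Lighting Off

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_ST;

            struct v2f {
                float4 pos : SV_POSITION;
                float2 uv  : TEXCOORD0;
            };

            v2f vert (appdata_base v) {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target {
                // RGB and alpha both come straight from the transparent video
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}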