November 28

DevLog: AR Voice Chat

Building an interactive AR chat app using Flutter, Firebase, and Apple's ARKit.
Picture by Alexander Pinker — December 4th, 2018

Problem with Current Voice Chat Systems

Since the beginning of the current pandemic, we have continually looked for virtual alternatives to keep us connected with friends and loved ones. Zoom's stock has risen sixfold since the start of COVID, alongside large user increases at Google Meet, Houseparty, Skype, and others. The virtual calling industry was booming as the whole world began relying on it.
There could not have been a better opportunity for companies like Zoom. Yet even though demand was higher than ever before, these companies didn't try to innovate on the conventional idea of video calling.
The problem people ran into is that you cannot interact with users individually; you can only address everybody in the room at once, which makes communication and productivity drop.
At an in-person meeting, partners can talk with their counterparts about upcoming challenges or problems simultaneously; in Zoom, only one person gets to speak at a time. And it's not just a business problem: regular meet-ups with family and friends just feel off. Not natural. So how can we make them more natural?
In Virtual Reality environments, such systems already exist. They are called proximity chat and imitate a real-world, three-dimensional space where your chat volume to other people changes depending on your position. You can walk up to people, talk to them, and then move on to someone else. But not everybody can afford the VR headset required to run such applications, so I was thinking:
Would it be possible to create an augmented proximity chat in real-life environments that users can access with their everyday smartphones?

How this Standard could be Improved

The idea is that since nearly everybody has a phone nowadays, it can be your entry point into a virtual space, letting you meet up for class, after-school activities, or anything else you can think of. Everybody can join a virtual room, appearing as a rendered, human-like avatar. This would benefit basically anybody with a solid internet connection who wants to talk to somebody who isn't in the same room. Studies show that human-like interactions are better for our mental state, giving us the illusion of interacting with people physically. During the current global pandemic, depression rates climbed because people were locked in. This might help, not only those people right now, but also in our normalized future, since meetings and calls are daily procedures we all follow and will keep following.

My Plans for the First Prototype

My first prototype for this ambitious project doesn't feature too much. I will implement custom-styled objects and add plane detection and distance tracking. The second sprint will feature the actual chat app: connecting it to a Firebase backend and tracking player movements, as well as their voice, in a real-time database. In the beginning I tend to underestimate different parts, which in this case happened again.

The Technology I will be Using to Build this Project

Apple is encouraging developers to build AR applications, including cameras, sensors, accelerometers, and everything else needed for such applications in their devices. That's why I started making a prototype using Apple's ARKit; I felt it would be a great way to improve my knowledge of AR.
Before I began, though, I had to choose a framework and the environment I wanted to work with. I am a UI designer and developer, so my favorite framework is Flutter: a tool built on the Dart language that compiles into native Android, iOS, and web code. Even though I was using Apple's technology (their ARKit), I decided not to use their native programming language, Swift, as it would limit me to developing only for iOS. The ARKit plugin for Flutter I ended up using is pretty new to the market, and it is really extraordinary to see how much work goes into developing such a system. Huge shoutout to Oleksandr Leuschenko, who managed to turn ARKit into a usable Flutter plugin.
I will also talk about the different issues I faced, to help everyone out there who also wants to try the Flutter ARKit plugin and gets stuck with the documentation.

Project Setup and Initialization

So without further ado, I installed the plugin into my Flutter environment and initialized a 3D object spawned at location (0|0|0), which is the initial position when the so-called "ARKitSceneView" gets created. The object spawned is a simple 3D sphere.
import 'package:arkit_plugin/arkit_plugin.dart';
import 'package:vector_math/vector_math_64.dart';
...

class _CustomProjectState extends State<CustomProject> {
  ARKitController arkitController;

  @override
  Widget build(BuildContext context) => Scaffold(
        body: Container(
          child: ARKitSceneView(
            onARKitViewCreated: onARKitViewCreated,
          ),
        ),
      );

  void onARKitViewCreated(ARKitController arkitController) {
    this.arkitController = arkitController;
    final node = ARKitNode(
      geometry: ARKitSphere(radius: 0.1),
      position: Vector3(0, 0, -0.5),
    );
    this.arkitController.add(node);
  }
}
This wasn't too complicated, so I figured it would be a good idea to style objects. In the "ARKitSceneView", I initialized a controller keeping track of multiple objects and spawned a tube and a rounded container. I positioned them so they don't lie on top of each other by taking the current position and adding an (x, y, z) offset to the calculation before assigning the variable to the "position" parameter. Up to here, everything was going great!
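As a reference, here is a minimal sketch of what that styling step could look like. The geometry classes come from the arkit_plugin package, but the radii, sizes, and offsets are made-up values for illustration, not the exact ones from my project.

// Illustrative sketch: two styled nodes offset from a shared base position
// so they don't overlap. All dimensions and offsets are placeholder values.
final basePosition = Vector3(0, 0, -0.5);

final tubeNode = ARKitNode(
  geometry: ARKitTube(innerRadius: 0.03, outerRadius: 0.05, height: 0.2),
  position: basePosition + Vector3(0.2, 0, 0), // shifted right on the x-axis
);

final boxNode = ARKitNode(
  geometry: ARKitBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0.02),
  position: basePosition + Vector3(-0.2, 0, 0), // shifted left on the x-axis
);

arkitController.add(tubeNode);
arkitController.add(boxNode);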

Adding Physics and Plane Detection to Objects

Now, objects are still just placed at the initial location when the app gets opened. At least to me, that does not sound like a good idea if you are trying to create a chat system that is supposed to be as realistic as possible, featuring physics and gravity and all that. So I added a function that keeps track of flat surfaces and anchors them as planes. Once that was working, I assigned every plane an individual object and spawned it on that plane.
void _handleAddAnchor(ARKitAnchor anchor) {
  if (anchor is ARKitPlaneAnchor) {
    if (placing == true) {
      // if the button is pressed
      _addPlane(arkitController, anchor);
      print(anchor);
      setState(() {
        _numberOfAnchors++;
      });
    }
  }
}

void _addPlane(ARKitController controller, ARKitPlaneAnchor anchor) {
  anchorId = anchor.identifier;
  if (node != null) {
    // a node was already placed on this plane
  }
  node = ARKitNode(
    geometry: ARKitSphere(radius: 0.1),
    position: Vector3(0, 0, -0.5),
  );
  controller.add(node, parentNodeName: anchor.nodeName);
}
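For completeness, here is a rough sketch of how this handler can be hooked up. The planeDetection option and the onAddNodeForAnchor callback follow the arkit_plugin API; the rest of the wiring is an assumption about how my State class is laid out.

// Sketch: enable horizontal plane detection and register the anchor callback.
// This would live in the same State class as _handleAddAnchor above.
@override
Widget build(BuildContext context) => Scaffold(
      body: ARKitSceneView(
        planeDetection: ARPlaneDetection.horizontal,
        onARKitViewCreated: onARKitViewCreated,
      ),
    );

void onARKitViewCreated(ARKitController controller) {
  arkitController = controller;
  // fires whenever ARKit adds a new anchor, e.g. a freshly detected plane
  arkitController.onAddNodeForAnchor = _handleAddAnchor;
}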

Making Objects Individual

Physics and object placing can be as good as it gets, but if you only have cubes to talk to, the whole thing is useless. A simple four-line replacement, calling an ARKitReferenceNode (meaning a real 3D object) instead of an ARKitSphere node, actually kept me busy for two days, as I wasn't able to figure out a way to correctly import objects into the project.
The problem: ARKit was officially built for Xcode, which has a way to import .dae items into the "models.scnassets" folder; Flutter, however, didn't. I couldn't transform my project into an Xcode-compatible ARKit project, which is required to import 3D objects, so I had no idea what to do.
The only possible approach I came up with was: create an Xcode project → import the 3D model → wrap it with the Flutter project → forbid it from overwriting those files → import the asset URL into the project and hope for a random bird to be displayed. I still don't know how or why, but it worked!
node = ARKitReferenceNode(
  url: 'models.scnassets/Diplo.dae',
  scale: vector.Vector3.all(0.08),
  position: position,
);
arkitController.add(node);
Ok, objects are placeable. I have to admit, it was a cool experience walking around my room and seeing the whole thing work out way better than imagined. AR has come such a long way: you could walk up to those objects, shake your phone, walk through the whole house, and when I came back, ARKit still placed the avatars at the exact points where I had put them. That is thanks to ARKit's tracking system, which combines the camera feed with the accelerometer and gyroscope instead of relying on the camera alone.

Object Tracking for Distance Detection

The last thing for the first sprint: getting the relative position from every object to the current user (the view camera). That seemed easy at the beginning; however, it wasn't. The problem: even though I can get the distance to an object from the camera, that value is final, meaning it does not update once assigned. So I could place objects and display their distance to me; however, if I then changed my position, the distance indicator wouldn't update, and there was no controller I could use for that. But if I could get the current position (x, y, z) of every object relative to (0|0|0), the place where you first load the app, that point would become my center for running the calculations continuously. For now, I got it working to display and save each object's location and distance in the app, and to draw a line connecting each player to every other one, indicating their distance and, therefore, future volume.
void _onARTapHandler(ARKitTestResult point, ARKitController camera) {
  final position = vector.Vector3(
    point.worldTransform.getColumn(3).x,
    point.worldTransform.getColumn(3).y,
    point.worldTransform.getColumn(3).z,
  );
  print("Object position: $position");
  print("Distance to camera: ${point.distance}");
  ...
  // `line` connects the previous tap position to the new one (creation elided above)
  final lineNode = ARKitNode(geometry: line);
  arkitController.add(lineNode);

  final distance = _calculateDistanceBetweenPoints(position, lastPosition);
  final midPoint = _getMiddleVector(position, lastPosition);
  _drawText(distance, midPoint);

  if (origin != null) {
    final originLine = ARKitLine(
      fromVector: origin + vector.Vector3(0, 0, 0),
      toVector: position + vector.Vector3(0, 1.5, 0),
    );
    arkitController.add(ARKitNode(geometry: originLine));
  }
  lastPosition = position;
}

String _calculateDistanceBetweenPoints(vector.Vector3 A, vector.Vector3 B) {
  final length = A.distanceTo(B);
  return '${length.toStringAsFixed(2)} Meter';
}

vector.Vector3 _getMiddleVector(vector.Vector3 A, vector.Vector3 B) {
  return vector.Vector3((A.x + B.x) / 2, (A.y + B.y) / 2, (A.z + B.z) / 2);
}

Future Ideas and Implementations

Even though I can keep track of different objects and their positions relative to one another, I somehow cannot access the camera's location. There are multiple ways to approach this, but how exactly I will do it still needs to be figured out. Once this is working, I will try manipulating the dataset to move objects on their x- and y-axes. As I will keep track of the camera's position (the user) as well as the other objects, I will create a real-time database using Firebase. This will make it possible to join a room from two different phones and then move around. Next will be the voice chat system, which, depending on the distances, gives out different volume percentages, and finally making it user-friendly by adding an auth system and the option to create your own avatars. There are already great APIs out there that I could use for that exact use case.
Copyright Pinscreen 2021 — open API for developers
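To make the Firebase part a bit more concrete, here is a rough sketch of how player positions could be synced through a Realtime Database. The firebase_database package, the 'rooms/demo/players' path, and the x/y/z fields are all assumptions for illustration, not a final schema.

import 'package:firebase_database/firebase_database.dart';
import 'package:vector_math/vector_math_64.dart';

// Sketch: write the local player's position and listen for everybody else's.
// The database path and field names are hypothetical.
final playersRef = FirebaseDatabase.instance.ref('rooms/demo/players');

Future<void> publishPosition(String playerId, Vector3 position) {
  return playersRef.child(playerId).set({
    'x': position.x,
    'y': position.y,
    'z': position.z,
  });
}

void listenForPlayers(void Function(String id, Vector3 position) onUpdate) {
  playersRef.onValue.listen((event) {
    for (final child in event.snapshot.children) {
      final data = Map<String, dynamic>.from(child.value as Map);
      onUpdate(
        child.key!,
        Vector3(
          (data['x'] as num).toDouble(),
          (data['y'] as num).toDouble(),
          (data['z'] as num).toDouble(),
        ),
      );
    }
  });
}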
For everybody interested in seeing the whole structure, I uploaded the source code to this GitHub repository: GitHub Gist
I'd love to connect with you on LinkedIn; if you liked this article, you are welcome to subscribe to my monthly newsletter, where I publish all my latest achievements.