

I can see clearly now

Dr. Robert Ramsey, CEO, Rain Technology

AR and VR face technical hurdles before they’re ready for prime-time viewing. Rain Technology is working to help these technologies show their true colors.

What is Rain Technology?

We’re a small, R&D-focused company of about 20 people, many of them with PhDs. We hold a portfolio of about 700 patents related to displays and to controlling the light they emit. In particular, we’re working on switchable privacy displays for laptops and mobile devices, and on automotive passenger entertainment displays that prevent driver distraction.

You’re active in AR/VR. What are you doing there?

We’ve come up with techniques to overcome some of the problems preventing broader use of AR/VR. Those problems stem from the widespread adoption of an inefficient optical design. Much of our work centers on AR, augmented reality.

From a technical standpoint, how does AR differ from VR?

With VR, you’re wearing a headset that generates an entirely virtual environment. You have good color and brightness because the display makes up your entire field of view, and you don’t need to account for the light in the environment around you, or any light coming through a transparent screen.

With AR, you’re seeing through an image that’s projected into your eye: you’re looking through a pair of clear glasses onto which imagery is superimposed. It’s very hard to make that imagery look real.

What specific problems are you trying to solve?

One big issue concerns color. Most AR systems diffract light, meaning they bend it. Different colors of light bend at different angles. That distorts the colors of the images you see, so they look bad. To get around that problem, AR systems often use a single color, usually red or green. We want to give viewers a better experience by allowing them to see a full range of bright colors.
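
To make the color problem concrete, here is a back-of-envelope sketch using the standard grating equation, d·sin(θ) = m·λ. The 1,000 nm grating pitch, normal incidence, and first-order diffraction are illustrative assumptions, not figures from Rain Technology.

```python
import math

# Illustrative only: first-order diffraction angles from the grating
# equation d*sin(theta) = m*lambda at normal incidence, with m = 1.
# The 1000 nm pitch is an assumed value, not Rain Technology's.
PITCH_NM = 1000.0

for color, wavelength_nm in [("blue", 460), ("green", 530), ("red", 630)]:
    theta = math.degrees(math.asin(wavelength_nm / PITCH_NM))
    print(f"{color:>5}: {theta:.1f} degrees")

# blue: 27.4, green: 32.0, red: 39.1 -- each color exits at a different
# angle, which is the color distortion described above.
```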

Second, AR has trouble producing high-definition imagery. People have gotten used to seeing nearly 4K picture quality, and AR can’t deliver it yet. The pictures are blurry and not very bright, with a limited field of view. They are typically only one color and show cartoonish scenes with little stick figures. That may be tolerable in some industrial scenarios, but consumers won’t accept it, and these constraints also limit AR’s uses in fields ranging from manufacturing to surgery.

Third, augmented reality has a narrow field of view—the area you see through the headset, glasses, or other optical device. You might look left and right, then realize that the image you should be seeing has been cut off.
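
Resolution and field of view trade off directly: the same pixels get spread over more degrees as the view widens. Here is a rough sketch with illustrative numbers, a 4K-class display and the common 60 pixels-per-degree rule of thumb, neither taken from the interview.

```python
# Back-of-envelope sketch of the resolution / field-of-view trade-off.
# 60 pixels per degree is a common rule of thumb for "retina" sharpness;
# the display and FOV values are illustrative, not any product's specs.
RETINA_PPD = 60           # pixels per degree for imagery to look sharp
HORIZONTAL_PIXELS = 3840  # a 4K-class display

for fov_deg in (30, 50, 100):
    ppd = HORIZONTAL_PIXELS / fov_deg
    verdict = "sharp" if ppd >= RETINA_PPD else "visibly soft"
    print(f"{fov_deg:>3} deg FOV -> {ppd:5.1f} px/deg ({verdict})")

# Widening the view spreads the same pixels thinner: 30 deg gives
# 128 px/deg (sharp) while 100 deg gives 38.4 px/deg (visibly soft).
```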

What about the problem of uncomfortable headsets? Gamers can’t usually wear those things for more than about 15 minutes at a time.

Yes, comfort is an important aspect of the user experience. Current AR design forces users to wear heavy, hot eyewear because large batteries are needed to make images bright enough.

We’re mitigating that problem using anamorphic design. It gives you 10x the brightness provided by regular AR glasses, reducing eye fatigue and allowing people to use AR for longer periods of time.

How does it do that?

Conventional AR optics expand the image in both dimensions to fill your field of view, and expanding the image distorts it and degrades its quality. You can’t read text, because it’s blurry. You get only a tiny, narrow window where images are sharp and clear. As your eye moves left and right, the imagery gets blurry again, and you’ve got to continually re-focus your gaze. That causes eyestrain.

Anamorphic design prevents this. In the horizontal plane, it doesn’t expand the image at all, and in the vertical dimension, it expands only a bit. That dramatically widens the “comfort zone,” where objects appear clear and sharp, greatly reducing eye fatigue, headaches, and other negative side effects.
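
As a sketch of the arithmetic behind that 10x figure: if perceived brightness scales roughly inversely with the area over which the optics expand the image, plausible expansion factors reproduce the gain. The factors below are made up for illustration; the interview gives only the approximate 10x result.

```python
# Illustrative arithmetic only: treat perceived brightness as inversely
# proportional to the area over which the optics expand the image. The
# expansion factors are assumed; the interview gives only the ~10x figure.
def brightness_gain(conv_h, conv_v, ana_h, ana_v):
    """Brightness gain of anamorphic vs. conventional 2D expansion."""
    return (conv_h * conv_v) / (ana_h * ana_v)

# Conventional waveguide: expand ~4x horizontally and ~3x vertically.
# Anamorphic: no horizontal expansion, slight (~1.2x) vertical expansion.
print(f"{brightness_gain(4.0, 3.0, 1.0, 1.2):.0f}x brighter")  # -> 10x
```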

How does all of this relate to 5.5G?

Improved optics and user experience for AR and VR systems dramatically increase the utility of these devices for both consumers and enterprises, in applications from healthcare to manufacturing to defense. 

For example, if the anamorphic design can increase the average session time from 20 minutes to four hours, the limitations shift from the user (for example, fatigue) to the content. Limitations on content are mitigated by high network speed and fidelity. That’s where 5.5G comes in. 

AR systems need to push a lot of real-time information into the headset or other display. Unless you’re wearing a big computer on your back, that information is coming from the cloud, and streaming into your glasses.

So, you’ve got these computationally intensive things happening in the cloud, and the content needs to reach the display wirelessly, in real-time. To accomplish that, you need the high speed and minimal delay of 5.5G.
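
For a rough sense of the numbers involved, here is a hedged link-budget sketch. The per-eye resolution, refresh rate, compression ratio, and latency figure are assumptions for illustration; the interview only asserts that high speed and minimal delay are required.

```python
# Hedged sketch of the link budget for cloud-rendered AR. The frame
# parameters and compression ratio below are assumptions for illustration.
WIDTH, HEIGHT = 1920, 1080   # per-eye resolution (assumed)
FPS = 90                     # refresh rate (assumed)
BITS_PER_PIXEL = 24          # uncompressed RGB
COMPRESSION = 100            # assumed video-codec compression ratio

raw_bps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS * 2  # two eyes
print(f"raw stream:        {raw_bps / 1e9:.1f} Gbit/s")               # ~9.0
print(f"compressed stream: {raw_bps / COMPRESSION / 1e6:.0f} Mbit/s")  # ~90

# Motion-to-photon latency: AR overlays visibly lag the real world
# beyond roughly 20 ms, so the network round trip must be a small slice
# of that budget -- the low single-digit milliseconds 5.5G targets.
print(f"frame time:        {1000 / FPS:.1f} ms")  # ~11.1 ms per frame
```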

If you look at, say, Magic Leap or other companies that have made AR glasses, they have a huge tether running from the headset to something like a backpack, which holds a computer. The computer senses where you are in space and does all the computations that produce the AR imagery.

But if you could do all that in the cloud, and use Bluetooth, Wi-Fi, or 5.5G to get the imagery into your glasses, then the glasses could show you everything you need. You could look at a mountain and know its name, how tall it is, what trees grow on it, and where the summit is located. All of that information would be superimposed in your field of view. And you wouldn’t need to lug around a computer on your back to get it.

How will this new AR technology be used?

In an industrial setting, there’s a good use case for logistics. Let’s say you work for UPS or FedEx, and you’re looking for a particular box on a conveyor belt. Your AR glasses could track that box for you: when it appears on the belt, your lenses would display a red arrow pointing to the box you want.

The system knows where you are, it knows where the box is, and it knows where your eyes are looking. It’s this kind of real-time information on a heads-up display that the AR world provides, as opposed to the VR world.
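
A minimal, hypothetical sketch of that overlay decision, with invented names and a flat 2D simplification, might look like this:

```python
import math

# Hypothetical sketch of the logistics overlay: given the wearer's
# position/heading and a box's position (all in a flat 2D warehouse
# frame), decide whether the box is inside the glasses' field of view
# and, if not, which way the guide arrow should point. All names and
# numbers here are invented for illustration.
def arrow_direction(user_xy, user_heading_deg, box_xy, fov_deg=40):
    """Return the overlay cue for guiding the wearer to the target box."""
    dx = box_xy[0] - user_xy[0]
    dy = box_xy[1] - user_xy[1]
    bearing = math.degrees(math.atan2(dy, dx))                 # world angle to box
    relative = (bearing - user_heading_deg + 180) % 360 - 180  # wrap to [-180, 180)
    if abs(relative) <= fov_deg / 2:
        return "on screen: draw red arrow over the box"
    return "turn left" if relative > 0 else "turn right"

print(arrow_direction(user_xy=(0, 0), user_heading_deg=0, box_xy=(5, 1)))
# bearing ~11.3 degrees, inside a 40-degree FOV -> arrow drawn on the box
```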

What if you work in a regular office setting?

Glasses could provide heads-up information there as well. You might get a notification, a little pop-up that says, “Hey, you've got a meeting in 10 minutes.” Your AR glasses would have essentially become your mobile phone. 

Outside of work, the glasses could show you maps. You’re walking down the street, and the glasses tell you where to turn, or the name of that big building across the street. All this information would be coming to you, but you'd still see the real world. You wouldn’t be looking down at your phone.

Will these glasses eventually replace our phones?

Maybe. It’ll evolve. First, you'll have some basic information that starts to come into your visual field; as the infrastructure gets better, you'll start to get more. You might be able to turn pages by blinking or by physically touching the glasses. Haptics is big right now, so a feedback mechanism on glasses is quite possible.

