DANCE DANCE EVOLUTION

Merritt Moore is a quantum physicist with a Ph.D. in atomic and laser physics from Oxford University. She teaches creative robotics at New York University-Abu Dhabi. Moore is also a ballerina who has performed with world-class dance companies, including Zurich Ballet, Norwegian National Ballet and Boston Ballet.

At the intersection of the Venn diagram of Moore’s seemingly disparate professional pursuits lies her passion for dancing with cobots, industrial robots that can work alongside humans in the same space.


“Sometimes creativity is just merging ideas in different ways.”

Dr. Merritt Moore


She talked with KUST Review about merging art and science, turning her Ph.D. project into interpretive dance for a contest and a new ambition that surfaced after she appeared on a grueling BBC reality series.

LISTEN TO THE DEEP DIVE

| QUESTION: You’re a ballet dancer and a physicist. That’s an unusual mix. Can you talk about how that came about?

I started dancing at 12 or 13, but was told I would never make it professionally. So I went to Harvard to study physics. But when I was there I still had this love for dance and auditioned like crazy and took a year off to dance.

When I was working on my Ph.D. at Oxford, I danced with the English National Ballet. Then the pandemic hit and I had a residency at Harvard University’s ArtLab.

CAPTION: PHOTOS IMAGE: Courtesy Merritt Moore

My interest was piqued by AI in terms of how it could enhance our creativity: Sometimes creativity is just merging ideas in different ways. I couldn’t dance with humans, but robots couldn’t get COVID. A robot company generously lent me a robot.

I created more and more video content and was invited to perform live. It opened the doors to more questions and possibilities.

| Q: You’ve talked before about how physics helped you be a better dancer. Can you explain more?

Because I was in physics classes all day, I couldn’t be in the dance studio much, so I really used the power of visualization at night and would visualize doing the ballet moves. But at the same time I was understanding inertia and torque and friction and how your arms can slow you down or project motion.

IMAGE: Freepik
Dance your Ph.D.

Since 2008, Dance Your Ph.D. has encouraged scientists to explain their Ph.D. dissertations through interpretive dance.

Winners get modest cash prizes and, naturally, bragging rights.

The 2020 overall winners were a trio of students from the University of Helsinki who used dance, rap and a wardrobe of white, short-sleeved button-down shirts to explain their computational study of molecular clusters. The 2022 and 2023 winners used dance to explain the electroporation of yeast cells and nanoMOFs.

The contest is sponsored by the American Association for the Advancement of Science, Science magazine and artificial intelligence company Primer.

(I was) visualizing the angle I’d need on take-off to get the highest leap. It’s using physics to maximize the least effort in a way. I could almost release and let physics do as much as possible. It also helped me get out of my head.

| Q: Has dance helped you be a better physicist?

Dance helped because I think there’s a huge importance in mind-body connection. Dance opened up so much passion.

For the Dance Your Ph.D. contest I created a dance called “EnTANGOed” (about the spontaneous parametric down-conversion equation). Everything became a metaphor. It made me think conceptually about the equation. (As scientists) we’re taught to memorize and regurgitate information. But it’s often missing something.

Einstein imagined himself as a photon or a light beam. So many breakthroughs happened outside the lab. It was a realization that there’s this unspoken pressure that a good physicist’s head is in the textbooks. But dance helps you understand physically what’s going on.

| Q: You engage in youth outreach to encourage kids in STEM and founded the group Science-Art-Sisters to encourage girls to think about science in a creative way. How do students respond?

I’m always surprised by how many are so hungry for it. During the pandemic I created Zoom calls with SciArtists from around the world. I was expecting 40. There were about 300. It was a breath of fresh air. If I could squeeze in extra hours in my day I’d do it again.

CAPTION: PHOTO IMAGE: Courtesy of Merritt Moore

| Q: You participated in the U.K. reality series “Astronauts: Do You Have What It Takes?” and U.S. competition show “America’s Got Talent.” Which was more nerve-wracking, helicopter training or facing Simon Cowell?

The astronaut one was definitely more nerve-wracking in the sense that they take away your phone, they take away your computer. I had no idea what was coming up next. The unknown made it more nerve-wracking than anything else. (It was) all day every day.

The stuff they don’t show: Anytime we were waiting, we were having to do IQ tests, EQ tests. We were constantly miked up and filmed. It was really intense. It was the best experience of my life but also, yeah, really intense.

| Q: You have also talked about your hopes to become an astronaut and dance on the moon. How do you envision that would look?

I think that the weightlessness is the ethereal aspect of it. On those levels it would be so incredible. I would also love to explore what’s the new language up there. What’s the new language of dance? How do we create or optimize it?

IMAGE: Freepik
Exoskeleton crew

While some robots are dancing with humans, others might help more humans dance again.

“These are really great problems and interesting challenges to be solved,” says Lakmal Seneviratne, KUCARS’ founding director.

Taking up those challenges: Irfan Hussain, a KU robotics professor researching variable stiffness actuators (VSA), which mimic human muscles that become stiff or soft depending on the task. For tasks that require accuracy, like throwing a ball or writing, the muscles become stiff, while for tasks that require safety, like physically interacting with humans, the muscles become soft, he says.

Hussain is working on a VSA device that uses bioinspired systems to create joints that can become stiff or soft as needed. It’s a robotic exoskeleton that people who have had a stroke could wear on their legs. The device, funded by Emirati investment fund Mubadala, could aid rehabilitation by mimicking the function of a knee joint, Hussain says. The same principle would go into building soft robotic hands that might help stroke patients safely grasp objects, Hussain adds.

| Q: Did you want to become an astronaut before the BBC series, or did it jump-start a new ambition?

It definitely launched a new ambition. It’s not exactly a career that career fairs talk about.

| Q: You frequently dance with an industrial robot arm that you program. Are you interested in choreographing dances with other kinds of robots or is there something about the robot arm specifically that speaks to you artistically?

I’d love to explore so many different (kinds). The more robots the better. The more expertise the better. I’d like to dance with the massive ones. That would be super interesting. It’s just complicated to get access.

| Q: How do you envision AI and robotics will contribute to the arts in the future?

I think (robots are) an incredible tool that we can use for human expression. People get worried: Are you going to replace human dancers? No, that will never happen.

Painters got worried when we invented cameras 200 years ago. Painting is still valued, but photography is now an art. You can see a photographer’s work and you can see a human dignity to it. I think the same will happen with robot dancers.

IMAGE: Freepik
ONE GIANT LEAP FOR ART

Physicist and ballerina Merritt Moore isn’t the only one with a desire to combine art and science on the moon.

Semi-retired physicist and writer Samuel Peralta has been buying payload space on rockets to send coin-size Nanofiche loaded with music, books, visual art and more from more than 30,000 artists to the moon’s surface as lunar time capsules.

Canadian Heather Horton is one of the contributors to the project, called the Lunar Codex. “Every time I look at the moon, for the rest of my life, it will be different,” she tells the Guardian.

“I think what we have done here is the most global, the most diverse, the most expansive project,” Peralta says. “I sometimes think of the Lunar Codex as performance art,” Peralta adds. “This is the greatest performance art of my life!”

With AI, this is where it will get a little blurry, and it depends how we legally start thinking about it. AI brings together a lot of people’s different work. It still needs human expertise to curate it well.

| Q: When you’re choreographing a dance with the robot arm, do you start with the human’s movements or the robot’s? What are the limiting factors?

I love that I can change the “formula” each time. Sometimes I start with human movement, sometimes I start with the robot’s movement.

Limiting factors are that the robot does not have arms or legs, so it’s always a puzzle to figure out what type of movements will “read.” The speed is sometimes an issue because if it is too fast, there is a risk it will fall over. There are limits to how much it can rotate (but I’m much less flexible than the robot).

| Q: What do you hope your audiences take away from your performances?

Audience members have mentioned that they never imagined a dance with a robot could be so moving. I always hope that audience members leave deeply moved and spirits lifted.

I want to show the blend of technology and human emotion, pushing the boundaries of what’s perceived as traditional art. My hope is that audiences leave not only moved by the beauty of the unexpected partnership but also inspired by the possibilities that arise when we merge diverse disciplines.

CAPTURING STYLE

In a game of chess, the outsider often gauges the quality of the gameplay by the entertainment and surprise factor offered by the players’ choice of moves. A good chess match is a mind-bending dance of strategy and power: a true spectator sport.

The advent of artificial intelligence (AI) — specifically, software chess engines — seems to have painted this vibrant tableau in shades of monotony.

LISTEN TO THE DEEP DIVE

After chess world champion Garry Kasparov’s defeat against IBM’s Deep Blue in the late ‘90s, human chess players have grudgingly accepted that chess engines can beat them. However, they found solace in the fact that chess engines were utilitarian in their style: A classic chess engine may well consider tens of millions of alternative moves per second, but its playing style — especially with a limited lookahead — is boring. Unadorned and calculated. There’s none of the dynamism or creativity. None of the humanness. Developers did add in some randomness to introduce a semblance of unpredictability, but this resulted in predictable sequences of generally good moves interspersed with occasional mistakes.

As the AI meticulously generates all possible move sequences, it assigns an evaluation score to each resulting game state. This score takes into account the relative power balance between the players after a move. For example, if white gains a pawn, the score is +1, but if black gains a knight, the score for white is -3. More sophisticated engines incorporate positional information, such as the location of the pieces on the board and even factors like piece mobility and the safety of the king.

The addition of randomness forces chess engines to choose a move sequence at random from those with similar scores, or even occasionally play a randomly generated move, just to spice things up.
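The two mechanics described above — material scoring and randomized tie-breaking — can be sketched in a few lines of Python. This is an illustration, not any real engine’s code; the piece values are the standard material values, and the function names are invented for this example.

```python
import random

# Standard material values, as in the scoring described above:
# a pawn is worth 1, a knight 3, a bishop 3, a rook 5, a queen 9.
PIECE_VALUES = {"P": 1, "N": 3, "B": 3, "R": 5, "Q": 9}

def material_score(pieces):
    """Score a position for white: white material minus black material.
    `pieces` lists the pieces on the board; uppercase = white, lowercase = black."""
    score = 0
    for piece in pieces:
        value = PIECE_VALUES.get(piece.upper(), 0)  # kings score 0
        score += value if piece.isupper() else -value
    return score

def pick_move(scored_moves, tolerance=0.0):
    """Choose at random among the moves whose scores are within `tolerance`
    of the best one -- the randomization that 'spices things up'."""
    best = max(score for _, score in scored_moves)
    candidates = [move for move, score in scored_moves if best - score <= tolerance]
    return random.choice(candidates)
```

If white has just gained a pawn while black gained a knight, `material_score` nets out at -2 for white, matching the +1/-3 bookkeeping in the text.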

But these chess engines still spit out long sequences of unimaginative good moves, peppered with the odd bad one.

And, as human players know all too well, blind randomization is unlikely to land a player on a winning streak.


CAPTION: ERNESTO DAMIANI is senior director of the Robotics and Intelligent Systems Institute at Khalifa University

The subsequent generation of chess programs, powered by artificial neural networks (ANNs), learn from gameplay examples rather than predetermined formulas. These neural networks encode a binary representation of each position on the board, piece type and player color.

The outputs are the evaluation function values, leading to rapid and unpredictable gameplay, as the bulk of the computation is done during training, not live games. Developers use millions of online chess games played between humans or by humans against machines to set up training data. As the outcome of each game is known, it’s possible to tune the models more finely, selecting the best move for each scenario. Once trained, the ANN lets the chess engine quickly and efficiently evaluate the score for each possible move — ANN-based chess engines become even more positional. And boring.

Google’s AlphaZero changed the landscape. It implements a new ANN-based approach that can be trained to play not just chess but other board games. Given a game state, the AlphaZero engine computes a policy that maps the game state to a probability distribution over possible moves. In chess, the policy covers 4,672 candidate moves — even for white’s first move. For human players, this is ridiculous: There are only 20 legal moves for white’s first move. And that’s the point.

DESIGN & PROMPTS: Anas Albounni, KUST Review IMAGES: AI Generated, KUST Review.

AlphaZero includes all sorts of moves that are illegal, like selecting empty squares, selecting opponent’s pieces, making knight moves for rooks, or making long diagonal moves for pawns. It also includes moves that pass through other blocking pieces.

During training, nothing is learned or imposed about avoiding non-valid moves. The engine just post-processes the ANN output, filtering out illegal or impossible moves by setting their effective probability to zero. Then it re-normalizes the probabilities across the remaining valid moves. A philosopher could argue that this engine is devoid of ethics as it does not distinguish illegal from impossible, but the resulting ANN structure is simpler than one expressing only valid moves. On a modern processor, AlphaZero needs just a few tens of milliseconds to make a move.
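The filtering step is simple to sketch. Assuming the raw network output is a mapping from move identifiers to probabilities (the names here are illustrative, not AlphaZero’s actual interfaces), the post-processing looks like this:

```python
def mask_and_renormalize(raw_policy, legal_moves):
    """Zero out the probability of every illegal or impossible move,
    then renormalize so the remaining probabilities sum to 1."""
    masked = {move: (p if move in legal_moves else 0.0)
              for move, p in raw_policy.items()}
    total = sum(masked.values())
    if total == 0.0:
        # Degenerate case: the network put all its mass on invalid moves.
        # Fall back to a uniform distribution over the legal ones.
        return {move: (1.0 / len(legal_moves) if move in legal_moves else 0.0)
                for move in raw_policy}
    return {move: p / total for move, p in masked.items()}
```

If the network assigns 0.5 to e2-e4, 0.3 to an impossible two-square diagonal pawn slide and 0.2 to Ng1-f3, masking leaves e2-e4 at 5/7 and Ng1-f3 at 2/7 of the probability mass.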

This speed enabled AlphaZero to play against itself in millions of games, completing its training with reinforcement learning, which privileges moves that lie on a sequence that led to victory in the past. Of course, to know whether a move is on a winning sequence, one must complete the game, so AlphaZero reinforcement was performed by playing “fast and dumb,” i.e., using a very shallow search depth. Playing dumb in the reinforcement training phase maximizes the number of games that end in victories and defeats rather than in uninformative draws. As a result of this, AlphaZero considers fewer positions than the algorithmic chess engines of the past.
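The idea of privileging moves that lie on past winning sequences can be illustrated with a deliberately crude tally. Real systems like AlphaZero adjust neural-network weights rather than keeping per-move counters; everything below is a made-up simplification of the principle.

```python
from collections import defaultdict

def reinforce(stats, game_moves, won):
    """After a finished self-play game, credit every move on its sequence:
    a win reinforces the whole sequence; a draw or loss adds a play
    without a win, merely diluting the move's record."""
    for move in game_moves:
        wins, plays = stats[move]
        stats[move] = (wins + (1 if won else 0), plays + 1)

def preference(stats, move):
    """Fraction of past games containing `move` that ended in victory."""
    wins, plays = stats.get(move, (0, 0))
    return wins / plays if plays else 0.0

# Example: two fast, decisive self-play games.
stats = defaultdict(lambda: (0, 0))
reinforce(stats, ["e4", "Nf3"], won=True)
reinforce(stats, ["e4", "c3"], won=False)
```

After these two games, "e4" carries a preference of 0.5 and "Nf3" a preference of 1.0 — which also shows why decisive games are more informative than draws: only finished games with a clear result move these numbers apart.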


A classic chess engine may well consider tens of millions of alternative moves per second, but its playing style is boring.

AlphaZero and successors like Deep Chess have a distinctive style, steering away from merely seeking positional or material advantage. Their style is alien — best likened to atonal music: difficult to appreciate for anyone but the chess elite, and certainly useless for the average human amateur to learn or improve their game.

It is interesting that we humans still describe an intelligent chess engine’s style in positional terms.

Like the ‘90s Deep Blue victory, today’s post-AlphaZero scenario highlights some general problems that we will have to solve to be able to work together with the super-human AI engines of the future. AI decision-making must be intelligible to humans for us to accept its decisions. This interpretability needs to be wired into the AI training. Plus, interacting with humans is a crucial step for AI engine evolution: Playing against all possible competitors makes them stronger than any human can individually hope to become.

The game of chess, always a metaphor for life, suggests that controlling the evolution of future AI engines may become more akin to taming a tiger than training a pet.

Ernesto Damiani is senior director of the Robotics and Intelligent Systems Institute at Khalifa University.

DATA to delivery

Welcome to Industry 4.0, considered by many experts to be the fourth industrial revolution. Artificial intelligence and data analytics are a big part of it and are already changing how supply chains work. Here are just some of the ways they make getting a product from the manufacturer to your home cheaper and more efficient.

IN THE FACTORY

Generative design: An algorithm receives design parameters (such as cost and information on available materials) and generates thousands of options to find the best one.

LISTEN TO THE DEEP DIVE

Order management: AIs handle complicated order information from multiple channels.

Quality control: Sensors inspect products for defects.

Predictive maintenance: AI monitors systems and machines for early signs something is about to break down, preventing expensive factory shutdowns.

Compliance management: AI manages the red tape when the same product is sold in different markets with different regulations.

Customization: AI may be used to create such customized orders as bespoke suits and made-to-order shoes. And in a process called “reshoring” or “nearshoring,” products made far away can be customized closer to the sale point at the last minute.

IN THE WAREHOUSE

Stocking: Digital cameras monitor inventory levels and AI robots pick, sort and pack products.

Finding damaged packages: Machine learning models scan and analyze images to spot damaged objects.

Helping workers with wearable technology: Smart glasses “read” barcodes. Natural language processing helps humans work hands-free to pick items more safely.

THROUGHOUT THE PROCESS

Supply chain visibility: Internet of Things (IoT) devices provide instant information about such conditions as the location and temperature of shipments. Businesses can spot bottlenecks, manage disruptions in real time and make data-driven decisions.

Collaborative supply chains: Multiple companies use data and analytics to work together to plan and execute supply chain operations. The cooperative approach allows the companies to serve similar customers or achieve a common goal.

DELIVERIES

Optimal routes: Vehicle routing problem (VRP) algorithms use such factors as capacity, delivery priorities and time windows to plot the most efficient routes.

Real-time conditions: AI can monitor weather, traffic and other conditions to reroute as necessary.

Autonomous vehicles: Truck platooning technology can permit a group of vehicles to operate extremely closely, reducing wind resistance and decreasing fuel consumption for transportation between factory and warehouse or retailer. Smaller vehicles will be used for deliveries. Algorithms optimize routes while AI helps vehicles avoid collisions.

IN SEARCH OF THE METAVERSE

My office isn’t the most inspiring. It’s not bad, per se, but it’s not the peaceful lakeshore cabin conducive to creative thought and productivity that I’d like. If only it were socially acceptable to don my virtual reality (VR) headset and immerse myself in a futuristic cityscape or tropical haven and get all my work done. I want to pretend I’m floating among the stars on a spacecraft while replying to emails and writing my stories.

LISTEN TO THE DEEP DIVE

Jamie Gilpin, CMO at social media management tool Sprout Social, tells me that what I actually want is the metaverse. Sprout is one of many companies that have transitioned to a remote-first approach for their workers.

“Going to work in the metaverse may sound far-fetched but it may hold the answer to engaging workers in a virtual workspace. If your dream workspace is a beach, you might run into issues with sand getting into your keyboard,” she says. “The metaverse makes it possible to work wherever you want, without the limitations of the space. Allowing yourself to work in the environment where you feel most productive can yield incredible results.”

I had been thinking of VR, plain and simple. Is Gilpin right in saying the metaverse is the answer? What even is the metaverse?

As analysts for McKinsey and Co. wrote in a 2022 think piece, “if you’ve ever done a Google search for the term ‘metaverse,’ you’re not alone.”
Who hasn’t heard of the metaverse?

“The metaverse [was] the buzzword of 2022 in the same way that NFT was the buzzword of 2021,” says QuHarrison Terry, author of “The Metaverse Handbook: Innovating for the Internet’s Next Tectonic Shift.” “The metaverse is a fictional place imagined long before our current consumer-tech obsessions that has manifested into real progress. While the metaverse is far from a finished destination, there are thousands of people building it every second of every day.”

Herbert B. Dixon Jr. retired from the Washington, D.C., Superior Court in 2014. Before his retirement, he was overseeing the U.S. courthouse’s most modern prototype courtroom: high-def TV screens and all. Now, he’s a regular contributor to the American Bar Association’s Judges’ Journal and wrote in 2023: “The metaverse is a rapidly evolving idea. Describing the metaverse in 2023 is akin to explaining air or space travel to residents of the horse and buggy era. Every year, we see new technological advancements that a decade before would have seemed like science fiction.


“The metaverse makes it possible to work wherever you want, without the limitations of the space.”

Jamie Gilpin, CMO at Sprout Social

“The metaverse has been referred to as the three-dimensional internet and the future of the internet. My description of the future metaverse involves a digital universe (which may be real-world or imagined images) that your avatar enters to interact with other avatars.”

I don’t necessarily want an online representation of myself; I just want to pretend I’m working somewhere inspiring and quiet. But should I want to remain in my beautiful digital workspace, I’ll need an avatar to collaborate with my colleagues. They need a visible object in their digital environment that they can call “Jade” and I’ll need their avatars too. Yes, OK, online meeting platforms exist and I can change my background there and pretend I’m somewhere exotic but I want full immersion here.

Mariapina Trunfio, associate professor of economics and business management at the University of Naples, says the metaverse “defines a collective, persistent and interactive parallel reality created by synthesizing virtual worlds where people can use personal avatars to work, play and communicate with each other.”

In her 2022 Virtual Worlds paper, Trunfio explains that virtual technologies enhance the perceived immersion with the character realness of the avatars and residents: “Usually networked and situated with intelligent agents, they allow users to interact with virtual objects and intelligent agents freely, and to communicate with each other. In multiple forms, these worlds can be experienced synchronously and persistently by an unlimited number of users.”

I like the concepts of persistence and perceived immersion in Trunfio’s definition.

The McKinsey think piece also highlights that the metaverse means different things to different people:

“Some believe it’s a digital playground for friends. Others think it has the potential to be a commercial space for companies and customers. We believe both interpretations are correct. We believe the metaverse is best characterized as an evolution of today’s internet — something we are deeply immersed in, rather than something we primarily look at.”

In other words, as per the consultancy group’s working definition: “The metaverse is the emerging 3D-enabled digital space that uses virtual reality, augmented reality, and other advanced internet and semiconductor technology to allow people to have lifelike personal and business experiences online.”

ACCESS POINTS

To access the metaverse, says former judge Dixon, the user needs “a computer programmed to access the computer-generated environment, a head-mounted visual display or goggles to see the virtual environment, an audio headset, and hand- and body-tracking, motion-detecting controllers and sensors to provide a sense of touch and feel while traveling within the environment.”

Ernesto Damiani is the senior director of the Robotics and Intelligent Systems Institute and director of the Center for Cyber Physical Systems at Khalifa University. His definition of the metaverse focuses the most on the technology needed to access the metaverse: “The metaverse is a digital, virtual space that humans wearing haptic interfaces (like helmets, gloves and visors) can enter and roam by projecting their presence as avatars. The metaverse puts together virtual reality, augmented reality and low-latency multi-party communication technology to allow people to have lifelike interactive experiences through their avatars.”

GRAPHICS: Abjad Design

I own a VR headset. I mostly use it for gaming. Virtual reality offers me that escape from the real world — again, picture my peaceful and inspiring work-environment goals. Total immersion isn’t the only feature of the metaverse, though, and it’s not entirely practical for going about your everyday life. Enter augmented reality (AR).

Leslie Shannon likes the AR side of things. She authored “Interconnected Realities: How the Metaverse Will Transform Our Relationship With Technology Forever.” For her, the metaverse is a partly or fully digital experience that brings together people, places and information in real time in a way that transcends that which is possible in the physical world alone. She wants the metaverse to solve our problems — to be useful, not just entertaining.

“The problem is that smartphones and computers have done too well at solving the problem of delivering information and entertainment to us, exactly when and where we want it. To get this spectacular convenience, we’re prepared to pay a surprisingly high cost in terms of our connection to the people, places and things physically around us, and it’s a cost that we’re paying quite thoughtlessly today. You can probably name an incident in your own life within just the past week in which looking at a screen, rather than being present in your immediate surroundings, created a situation that caught you out socially, or made you neglect someone, or was even potentially dangerous. We’re all complicit in this one.”

How could an immersive digital world be the answer, Shannon asks. It’s not. But: “If we start thinking about a spectrum of experience, in which the far-left-hand side is 100 percent physical experiences, and the far-right-hand side is 100 percent digital experiences, then there also exists a middle point that is 50 percent physical and 50 percent digital, and sliding proportions of digital/physical mixes on either side of that middle point.”

Shannon says it’s the digital/physical mixes that deserve our attention. She calls this “interconnected realities.”

IMAGE: Abjad Design
Making the ‘metaversity’

By: Suzanne Condie Lambert

Khalifa University thinks the metaverse will be vital to the way students learn in the future. That’s why it teamed with Microsoft UAE and Hevolus Innovation for the 2023 Metaversity Hackathon, inviting student teams to create metaverse classrooms to remove physical barriers, making immersive, engaging and collaborative experiences inclusive and accessible. “One day we will have a university that is fully in the metaverse,” says Dr. Arif Sultan Al Hammadi, Khalifa’s executive vice president and KUST Review’s editor-in-chief. “Students will get the best education in the world wherever they are.”

KU wants to be in the vanguard, and the hackathon, he adds, is a first step to getting there. Higher-education institutions would benefit as well, requiring fewer physical resources. Al Hammadi points to the example of medical school cadavers, which are expensive and may pose ethical concerns.

Schools are already using interactive 2D screens to reduce the number of cadavers required to teach anatomy, he says. A 3D metaverse could be the next leap forward. There are downsides, Al Hammadi says. Cheating is harder to detect. The physical experience of labs and experiments can’t yet be fully replicated. And distance learning doesn’t offer the same social life as on-campus classes.

But Al Hammadi says that as models improve, students will eventually be able to get much of the same experience in the metaverse. Hadi Otrok, a KU professor of electrical engineering and computer science, sees promise especially in using avatars to free instructors from small tasks, like running tutorials. “The challenge will be,” he says, “how to get the students … engaged with you instead of on the phone.”

It will take courage to take these ideas and create a fully interactive online experience, Al Hammadi says, suggesting that a potential “metaversity” could start with just one degree to prove the concept. And Khalifa University, he says, wants to be on the front end of imagining that future.

“This concept of the metaverse is a world in which we can have the compelling, fascinating, relevant content that we currently access on screens, but integrated visually into our physical world in a way that enhances our lives, rather than removing us from them. This concept of the metaverse imagines the digital and physical aspects being incorporated with each other on a constantly sliding scale, so that sometimes we are fully immersed in a digital world, when that serves the purpose of the moment, but it is also possible to spend significant time fully immersed only in the physical world.

“This metaverse of interconnected realities will be a place where we combine digital information or entertainment from the world of the internet with our physical surroundings so that we can be more efficient, more informed, more delighted and more aware than we are today. A simple example of this enhanced future might be a sensor in my oven that connects with my AR glasses and, when the oven is on, displays its current temperature in a visual digital overlay when my gaze lingers on my oven for more than one or two seconds — useful when I’m on the other side of the kitchen.”

Are we talking about a heads-up display (HUD) fixed permanently in my vision? I’d quite like that. I wear glasses anyway. It would be so helpful if people in real life had little tags above their heads to remind me of their names — facial recognition in VR land. Or a mini-map in the corner of my field of view so I’d never get lost again, video game-style.

After all, HUDs aren’t new. In aviation, they date to the end of the Second World War when rudimentary systems were installed in a few military aircraft.

The modern-day fighter pilot helmet boasts an impressive HUD, and Iron Man had one too. Granted, Iron Man belongs to the realm of fiction, but plenty of technology emerged from the minds of creators and novelists — including the term “metaverse.”

“The term ‘metaverse’ was coined by author Neal Stephenson in his 1992 novel ‘Snow Crash,’” says Matthew Ball, author of “The Metaverse: And How It Will Revolutionize Everything.” “For all its influence, Stephenson’s book provided no specific definition of the metaverse, but what he described was a persistent virtual world that reached, interacted with, and affected nearly every part of human existence.”

There’s that persistence again.

The “affecting nearly every part of human existence” thing I’m not so keen on.

EVERYONE EVERYWHERE ALL AT ONCE?

“The metaverse is a vast, immersive virtual world simultaneously accessible by millions of people through highly customizable avatars and powerful experience creation tools integrated with the offline world through its virtual economy and external technology,” Wagner James Au says in his book “Making a Metaverse That Matters: From Snow Crash and Second Life to a Virtual World Worth Fighting For.” He also says, however, that the metaverse is not for everyone:

“Chances are you’ve seen more than several tech evangelists across various media outlets insist that we’ll all soon be in the metaverse. I can tell you from painful — but also amusing — experience that this is unlikely ever to be the case. And, no, you probably won’t wear a VR headset on a regular basis either.

“That said, it’s also safe to say at least one in four people with internet connectivity will be part of the metaverse on some level. At a very conservative estimate, over half a billion people worldwide already use one or more variations of a metaverse platform now, from Minecraft and Roblox to Fortnite, VRChat and Second Life. That’s about 1 in 10 of the 5 billion people across the planet who use the internet.”

The majority of Au’s examples are games. Gaming companies are the pioneers in the metaverse space, well known as early adopters and prototype metaverse builders. Minecraft and Fortnite offer virtual worlds where players meet as avatars to play games and chat. They offer in-game payment systems and in-game assets that travel with players across platforms: from PC to console to mobile. They are also social spaces where gamers forge online relationships and communities.

IMAGE: Abjad Design

This gaming-world innovation correlates closely with many working definitions I found of the metaverse concept. Indeed, Ian Khan, author of “Metaverse for Dummies,” says the metaverse refers to virtual reality-based online worlds and notes that many of these worlds are gaming environments or online games. “Others function more as online virtual places where you can do other activities such as meet people, learn new things or simply hang out. And the types of virtual worlds you can find in the metaverse continue to expand and are likely to continue to evolve.”

Many of the experts I found, however, wouldn’t say we have a metaverse yet.

Dixon says the metaverse does not yet exist, but “its ultimate scope is constrained only by the limits of human imagination.”

Aakansha Saxena, assistant professor at the School of Information Technology, AI and Cyber Security, Rashtriya Raksha University, calls the metaverse a “concept”: “It can be understood as an infinite universe where communities of people can collaborate and enjoy the mechanisms of augmented reality, virtual reality, extended reality, online life and much more.”

That sounds like many of these games to me.

Khaled Salah, professor of electrical engineering and computer science at Khalifa University, throws a spanner in the works with his definition, saying: “A metaverse is an immersive and 3D virtual world in which people can interact through avatars to carry out their daily interactions, unlocking the potential to communicate, transact and experience new opportunities on a global scale.”

I’m struck by his choice of article: “a metaverse,” not “the metaverse.” Of all the people I asked, books I read and research articles I consulted, Salah was the only person to raise the question of multiple metaverses. Does each gaming platform or each individual game have its own metaverse?

And if each platform has its own, how can we move seamlessly between them all?

Maybe Mark Zuckerberg, CEO of Meta, has the answer. He said on the Lex Fridman Podcast that the metaverse is not a construct of connected virtual places:

“Instead, the metaverse is the point in time when we do and spend large portions of our everyday digital work and leisure in immersive 3D environments with VR and AR glasses.”

Meta, of course, used to be Facebook, and the company changed its name in 2021 to highlight its new direction. The company has since announced a U.S.$2.5 million investment supporting independent academic research across Europe into metaverse technologies because “since no one company will own and operate the metaverse, this will require collaboration and cooperation.”

QuHarrison Terry, author of “The Metaverse Handbook,” sums it up:

“Let me clear the air and first tell you what the metaverse is not. The metaverse is not a single technology. It’s not just a place we’ll visit in VR. It’s not something that can be created and claimed by the next Bezos or Gates. In fact, the metaverse is about as boundless and unownable as the internet, if not more so. Sure, there are entities that have contributed more to the internet than others. Of course, there are innovations that steered the course of the internet and influenced the experience of the web. But we didn’t wake up one day with the internet we see now. It was an ever-evolving thing.”

WHERE ARE WE GOING WITH THIS?

“The metaverse in the early 2020s is the equivalent of the mid-1990s in the development of the internet: Many people are talking about it, a few people are already building it, but no one can really define what it is, or what it will be able to do for us, or even if it will be relevant to anyone at all once it’s here,” Shannon says.

Khan, author of “Metaverse for Dummies,” agrees that in terms of development, the metaverse today is where the internet was in the 1990s:

“The early internet was shaped by new ideas, technologies and ways of doing things. With the right investments, adoption and usage, the internet grew into the internet we know today. Similarly, the metaverse today provides an interesting place for many activities, but many of them are still in the early days of development. The investment and attention put into building the metaverse over the next five to ten years will determine what the metaverse ultimately becomes and the value it creates.”


“The metaverse becomes more real every time we replace a physical habit with a digital equivalent.”

QuHarrison Terry, author of “The Metaverse Handbook: Innovating for the Internet’s Next Tectonic Shift”

Per University of Naples’ Trunfio:

“The metaverse, like many innovations, is shrouded in mysticism and skepticism. If many believe it will be revolutionary and fully transform how people work, shop, socialize and play, others are skeptical, and see it as a fad. However, whether or not we think of the metaverse as a technological revolution, it is undeniable that the massive diffusion of this technology will impact on nearly all aspects of life and business in the next decade, allowing interaction in virtual and augmented spaces and a blend of both.”

Whether you’d say the metaverse is here already or well on its way, it’s clear that it’s the next big disruptor, the new place to be for all aspects of life.

After everything I’ve read and all the people I’ve spoken to, I think it’s funny that the definition of metaverse that resonates most with me is much more abstract than the very scientific approaches I’d usually turn to.

Shaan Puri, tech entrepreneur, posted a tweet in 2021 that sums it all up pretty nicely:

“The metaverse is the moment in time where our digital life is worth more to us than our physical life.”

Or as Terry puts it: “The metaverse is not just a place we’ll visit in VR. It is not a destination. The metaverse is a movement — a movement toward the digital-first livelihood we’ve slowly been adopting year over year, app by app. The metaverse becomes more real every time we replace a physical habit with a digital equivalent. We, the digital citizens of the internet, are manifesting the metaverse by trading time in the physical world for time online.”

I’m OK with this.

INNOVATION AT THE FOREFRONT

For the United Arab Emirates to continue to lead its region and beyond in information and communications technology, it needs to invest in advanced intelligent ICT, next-generation networks (NGN) and NGN-enabled ICT applications and services.

This is why Khalifa University, with partners e& and BT Plc, created the Emirates ICT Innovation Center (EBTIC), supported by the Telecommunications and Digital Government Regulatory Authority’s (TDRA) ICT Fund.

IMAGE: Khalifa University
Nawaf Almoosa

Nawaf Almoosa is an EECS faculty member at Khalifa University with a joint appointment as director of the Emirates ICT Innovation Center (EBTIC). His research interests include high-performance heterogeneous computing and distributed optimization and control, with applications to computing, telecommunication and robotic systems.

EBTIC aims to be an international center of excellence for applied artificial intelligence and intelligent ICT systems research and innovation. It is driven by strong industry, academia and government partnerships that promote world-class research, technology transfer, research training and open innovation in areas of strategic importance to its founding partners and the UAE.

BENEFITS TO THE COMMUNITY

EBTIC collaborates with its partners and other UAE government entities, delivering more than 40 strategically important projects each year. EBTIC has strong capabilities in machine learning, optimization, natural language processing, cooperative intelligence and big data analytics, as well as network architectures and cybersecurity.

Much of what it delivers drives new revenue or cost-savings opportunities for its partners. For instance, recent leading-edge projects provide intelligent-building solutions such as machinery-fault prediction, smart-metering analytics and Wi-Fi sensing. Power-usage forecasting and optimization also help companies significantly reduce their energy requirements, cutting both costs and carbon footprints.

Recently, EBTIC has been working closely with the Abu Dhabi Agriculture and Food Safety Authority (ADAFSA) to predict food-import levels and forecast local food-production quantities. This work gives ADAFSA a clear understanding of food supply in and out of the UAE and helps it maintain a robust and safe food market.

EYES ON COMMERCE

EBTIC also aims to commercialize the most promising of its research projects by spinning out start-ups into the UAE and global economies. Among the projects in development is 10Folds.

10Folds will be the region’s first machine-learning data-labeling solution provider for the Arabic market.

There is no Arabic-language labeling solution that considers the different dialects in the marketplace, creating a real problem for the development of AI solutions in the region.

Training a machine-learning model to correctly interpret Arabic requires humans to examine data and manually assign labels for the model to learn from, and Arabic words take on different meanings depending on the dialect. With tagging done by Arabic speakers across dialects, 10Folds aims to guarantee the quality of its labels, leading to a more accurate trained model.
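The dialect problem can be made concrete with a toy sketch: if labels are keyed only by word, the same word labeled by annotators from different dialects produces conflicting training data; keying by (word, dialect) removes the ambiguity. The placeholder words and label scheme below are invented for illustration and are not 10Folds’ actual data or pipeline:

```python
# Toy illustration of why dialect tags matter in Arabic data labeling.
# Annotators fluent in each dialect label the same word independently;
# "w1" and "w2" stand in for real Arabic words (invented placeholders).
labels = {
    ("w1", "Gulf"): "greeting",
    ("w1", "Levantine"): "farewell",  # same surface word, different sense
    ("w2", "Egyptian"): "greeting",
}


def conflicting_words(labels):
    """Words whose label would be ambiguous if the dialect tag were dropped."""
    by_word = {}
    for (word, dialect), label in labels.items():
        by_word.setdefault(word, set()).add(label)
    return {word for word, senses in by_word.items() if len(senses) > 1}


# "w1" cannot be labeled consistently without a dialect tag: a model trained
# on the flattened (word -> label) data would see contradictory examples.
ambiguous = conflicting_words(labels)
```

The design point is simply that the dialect tag becomes part of the training example, so each (word, dialect) pair gets exactly one unambiguous label.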

RESPONSE TO COVID

EBTIC leveraged its machine-learning capabilities to support the response to the COVID-19 pandemic, developing COVID spread models driven by digital-infrastructure data and applying accurate multilingual text analytics and natural language processing (NLP) techniques to social media to gauge public discussion and sentiment about the pandemic.

Recently honored as the UAE’s most inventive center at the Department of Economic Development’s Abu Dhabi Awards for Intellectual Property, EBTIC has more than 80 inventions patented or being filed, and 64 patents already granted. One major facet of EBTIC’s continued success is its knowledge-transfer work: EBTIC has so far trained more than 400 UAE students, including supervising 50 Ph.D. or Master of Science students, and more than 300 UAE-based professionals in big data-related competencies.

This mix of achieving scientific excellence, tackling national and societal challenges and building core AI skills in the UAE is central to EBTIC’s mission, and there is much more to come from its collaboration with industry, universities and governmental organizations that will further help the UAE cement its place as an ICT leader in the global economy.