Welcome to re:MARS 2022

Written by Amazon Staff
An image of the stage at re:MARS 2022. A man is walking onto the stage in an astronaut suit. In the background there is a large screen that says "Amazon re:MARS" with an image of a moon and space.
Machine learning. Automation. Robots. Space. It's all here in Las Vegas, and we're bringing it to you live.

Recent updates

You’ve landed on re:MARS

We’re here in Las Vegas for re:MARS, meeting the people building the future. Over the next few days we’ll be bringing you updates from the event—from keynotes, to interviews from the show floor, to demos of the coolest technologies changing our world, the universe, and beyond.

Register for the online event here to watch live keynotes. Follow this blog for all the action from behind the scenes.


The robots have arrived.

An image of shipping crates that say "Boston Dynamics" on them.


So have the astronauts.

An image of a space suit being modeled by a mannequin


And whatever this amazing thing is. We'll get to the bottom of it.

A GIF of floating orbs


We ran into Spot, the agile mobile robot from Boston Dynamics, on the show floor.

Meet Spot, the agile mobile robot from Boston Dynamics at re:MARS 2022


The first keynote goes live at 4:30 p.m. PDT

If rocketry, quantum computing, or leveraging space-based platforms to combat climate change interests you, you won't want to miss our kickoff keynote. Register for free to watch.

An image of the re:MARS keynote stage.


10 years of Amazon robotics

How robots help sort packages, move product, and improve safety. Learn more about the different types of robotic systems in our facilities around the world, including sort centers and air hubs.

An image of small floor robots moving around a showcase on the showroom floor at re:MARS.


Alexa, start re:MARS

The future for this fragile planet will be built in outer space.

Space travel, physics, and data analysis combined in the first re:MARS keynote in Las Vegas. Some quick lessons from the collection of physicists, engineers, and business leaders included:

  • The impact of machine learning on business and culture is just beginning.
  • If you have ever dreamt of a home in near-Earth orbit, your dream is about to happen.
  • Embrace ambiguity; quantum physics does.
  • Transportation needs to be reimagined for selling rides, not cars.
  • Dinosaurs went extinct because they didn’t have a space program, but that doesn’t mean an easy ride for humanity.
Tom Taylor, SVP, Amazon Alexa


What do the printing press, electricity, the internal combustion engine, and machine learning have in common? They are all revolutionary technologies of their time, and their impact on society was, and will be, enormous.

Adam Savage, Maker, Editor-in-Chief of Tested.com


For Savage, a sudden moment of understanding brings a rush of endorphins. He’s always looking for that feeling. He told the audience that leaving with more questions than they arrived with is what this week of re:MARS is all about.

Brent Sherwood, SVP Advanced Development Programs, Blue Origin


Space is our future, Brent Sherwood told the audience. How would our world be different if 100,000 people went to the moon every year? Ultimately, Sherwood sees millions of people traveling into space for new experiences, to build new businesses, and even new homes. It’s closer than you might think.

Shohini Ghose, Physicist, Founding Director, Laurier Centre for Women in Science


Here’s a head-scratcher: how quantum is quantum? Ghose guided the audience through the promise of quantum computing, from how it’s helping us understand the fundamental laws of nature to how it can protect people against quantum hacking. In the future, Ghose said, a quantum hacker would need to violate the laws of physics to crack quantum security. And so far, that has proven impossible to pull off.

James Crawford, Founder, Chairman, and CTO, Orbital Insight


If you wanted to look at all the satellite data available, you would need 8 million people analyzing it all day, every day. Crawford and his company want to use AI to distill all that data into insights about the things you can’t otherwise see.

Jesse Levinson, CTO, Zoox


According to Levinson, autonomous vehicles are a once-in-a-lifetime opportunity to transform mobility. But making them safe enough to carry people—not just pizzas—is a truly difficult problem to solve. With a whole lot of smart sensors, data, and testing, Levinson and the Zoox team are getting close.

Michio Kaku, Author, Science TV Correspondent and Radio Host


The next time you get kidnapped by aliens, make sure you steal something. There’s no current earthly law against it, and the rest of us need proof that intelligent life exists in outer space. It could be anything: an alien paperclip, or a time-bending dish towel.


Today's keynote starts at 9:00 a.m. PDT

How can machine learning disrupt a 2,800-year-old routine? Will robot helpers soon roam factory floors with humans? Today's keynote answers these questions and more. Register for free to watch.


What is ambient intelligence?

An image of a drive way with an orange car parked on it. There is a street sign in the foreground that says "Smart Home" on one street and "Alexa Ave." on the other.

If you haven’t already heard the term "ambient intelligence," chances are you’re already aware of one of its most recognizable examples: Amazon Alexa.

Today in his keynote, Alexa AI senior vice president and head scientist Rohit Prasad said ambient intelligence is artificial intelligence that is everywhere around us—responding to requests and anticipating our needs, but fading into the background when we don’t need it.

Take Alexa, which is both reactive, responding to explicit customer requests, and proactive, anticipating customer needs. Customers interact with Alexa billions of times each week, and more than 30% of smart-home interactions are initiated by Alexa, thanks to predictive and proactive features like Hunches (where Alexa alerts you when a connected smart-home device isn't in its usual state, for example if a smart light is still on when you say “good night”) and Routines (where Alexa saves you time by grouping several actions together so you don’t have to ask for each one individually).
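The two patterns above, proactive hunches and grouped routines, can be sketched in a few lines of code. This is purely illustrative: the function names, device names, and data shapes below are invented for the example and bear no relation to Alexa's actual implementation or APIs.

```python
# Illustrative sketch only: a "hunch" flags a device that isn't in its
# usual state, and a "routine" expands one spoken phrase into several
# grouped actions. All names here are hypothetical.

USUAL_NIGHT_STATE = {"porch_light": "off", "back_door": "locked"}

def hunches(current_state: dict) -> list[str]:
    """Return alerts for devices that differ from their usual night state."""
    return [
        f"Hunch: {device} is {state}, but it's usually {USUAL_NIGHT_STATE[device]}."
        for device, state in current_state.items()
        if USUAL_NIGHT_STATE.get(device) not in (None, state)
    ]

ROUTINES = {
    "good night": ["turn off living room lights", "lock back door", "set alarm for 7:00"],
}

def run_routine(utterance: str) -> list[str]:
    """One request triggers a whole group of actions."""
    return ROUTINES.get(utterance.lower(), [])

print(hunches({"porch_light": "on", "back_door": "locked"}))
print(run_routine("Good night"))
```

The point of the sketch is the shape of the interaction: the hunch is triggered by observed state rather than a request, while the routine saves the user from issuing three separate commands.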

Read Prasad’s blog to learn what makes Alexa one of the most complex applications of AI in the world, how Amazon has made Alexa far more knowledgeable over the years, and how he and his team are working to take Alexa’s question-answering capabilities to the next level through "conversational exploration."


Your garden is getting smarter.

An image of a man reading a book while sitting on his couch. In front of him is a table with a lit-up planter sitting atop the table.

Plants make people happier. That’s according to ēdn, a New York-based startup that wants to help more people grow them indoors. ēdn uses the cloud to power its indoor smart gardening products, providing plants with light, water, nutrients, and heat when they need them. Connecting Internet of Things devices like ēdn’s SmallGarden box to the cloud often requires developers to add tens of thousands of lines of new code to the device processor—something that demands specialized skills, especially when merging that code with their application.

Instead, ēdn has been able to use AWS IoT ExpressLink—software that powers connectivity modules (components that minimize the expertise needed to wirelessly enable a range of applications) from AWS Partners, making it easier to connect devices securely to the cloud. This has helped ēdn accelerate product development, maintain security, and let its engineers stay focused on developing technologies that make nature accessible indoors. AWS announced today that AWS IoT ExpressLink is now widely available. Find out more in this blog.


Using AI to make in-store shopping easier, faster (and more fun)

An image from the Just Walk Out showcase on the show floor at re:MARS 2022.
The Just Walk Out display on the show floor.
An image of the palm scanning technology that allows customers to enter and exit a store while purchasing their items without a cashier.
Amazon One technology in action.

Dilip Kumar, Amazon Vice President, Physical Retail and Technology, spoke to the re:MARS audience about how his team is using computer vision and machine learning to deliver easier and faster in-store shopping experiences for customers in physical spaces—from checkout-free shopping with Just Walk Out technology, to Amazon One (a fast, convenient, contactless way for people to use their palm to enter, identify, and pay at a store or location), and Amazon Dash Cart (a smart shopping cart that allows customers to skip the checkout line).

He also discussed how the technology behind Amazon Style, our first-ever physical apparel store, offers a personalized and convenient shopping experience that makes it easy for customers to find styles they love. “We went to great lengths to keep the fun in shopping, while also elevating the experience through machine learning algorithms,” said Kumar.

Read his blog post and watch the video to find out more, including how the team built all these technologies with security top of mind, how they use synthetic data to train algorithms, and how they continue to innovate while scaling services.


“Alexa: contact mission control”

An image of the Teacher of the Year award winners checking out the mockup space shuttle at the re:MARS 2022 convention.
An image of a man staring at a screen during a demo of a mock space shuttle.
An image of a woman taking a selfie photo in front of the space shuttle mockup at re:MARS 2022.

When NASA returns to the moon for the first time in over 50 years with the Artemis I mission, Alexa will be onboard to provide voice interaction between the Artemis I spacecraft and mission control at Johnson Space Center in Houston, Texas. The Artemis program will land the first woman and first person of color on the moon, establish the first long-term human presence on the lunar surface, and use the moon as the stepping stone for our next great leap—human exploration of Mars.

Alexa is joining the uncrewed test flight—expected to launch in fall 2022, and the first of three stages in the mission—as part of the Callisto payload, an industry-funded technology demonstration embedded into NASA’s Orion spacecraft and built in collaboration with engineers from Amazon, Cisco, and Lockheed Martin.

Here at re:MARS, there’s a full-size mockup of the spacecraft’s crew cabin, including a copy of the Alexa system that will be part of the first Artemis mission. We’ve been trying out the menu of commands, interacting with Alexa in the same way astronauts hope to do on future missions to the moon and beyond.


Dancing with robots

The next phase of robotics includes machines that move in partnership with people, and AI that can do more than turn the lights on and off—it will offer companionship too.

What is constantly amazing is that the tasks we breeze through daily—shopping in a grocery store, placing an item on a shelf, texting a friend—are far more complex and subtle than we imagine once they're broken down into a challenge for a machine to tackle. That doesn’t mean the problems aren’t solvable; they’re just really hard.

The Day 2 Keynote at re:MARS dug into the subtleties of building robots that can work with us. It explored the idea of generalized intelligence in AI, where our Alexa-enabled devices explore, learn, and act like we do. We were taken to space, and offered a glimpse of the structures that in just a few years will be the homes and workplaces of the first wave of people living off our planet. And we looked into our inner space—the human brain—and how thoughts can literally be turned into actions.

Register here to watch the full keynote.

A taste of what we learned from the technologists, futurists, and entrepreneurs assembled follows.

Tye Brady, chief technologist, Amazon Robotics


We are in the early days of robotics, akin to those early barnstorming days of flight. The comparison is apt, because the impact of robots will be as great as the airplane.

Dilip Kumar, VP Physical Retail and Technology, Amazon


It may seem obvious, but something we don’t like—whatever the context—is waiting in line. Removing those lines, however, is a monumental task, something that requires machine learning every step of the way.

Rohit Prasad, SVP and head scientist, Alexa AI, Amazon


Alexa is learning, and not just to respond to prompts and questions, but to offer more and more proactive help. The idea is that Alexa can not only offer a friendly reminder when you leave the back door open, but also just be a friend.

Ariel Ekblaw, director, MIT Space Exploration Initiative


The moon is going to be busy in the next few years. There are multiple efforts by governments and commercial outfits to live and work on the lunar surface. And this time, as they say at MIT, we are going to the moon to stay.

Thomas Oxley, CEO, Synchron, Inc.


For those who have lost the ability to communicate by voice or movement, one of the top requests they have is finding a way to simply text-message with loved ones and friends. They are on the verge of getting that wish fulfilled.

Justin Cyrus, CEO, Lunar Outpost


To better sustain life on this planet, we will need to turn to the resources on other planets. The moon is the first stop in this commercial endeavor, and Mars will be next.


The final keynote kicks off at 9 a.m. PDT

Get inspired by cutting-edge developments led by AWS and the real-world applications customers are building to redefine their industries. Register for free to watch.


The out-of-this-world teachers bringing science to life for their kids

An image of eight Amazon Future Engineer Teacher of the Year award winners smiling for a photo in front of the mock space shuttle at re:MARS 2022 in Las Vegas.

Eight of the ten winners of the Amazon Future Engineer Teacher of the Year award are here at re:MARS—checking out the exhibits and the tech showcase for ideas to take back to their classrooms. Each award-winner received a prize package valued at more than $30,000, including $25,000 to expand computer science and/or robotics education at their school. Find out more about how this talented team is inspiring the leaders of the future in this blog.


Putting a Snowcone in space

An image of the AWS Snowcone device floating in a space ship. There is earth in the view out the window in the background.

What happened when Axiom Space and AWS teamed up to find a way to help astronauts manage data generated onboard the International Space Station (ISS)?

We took the AWS Snowcone SSD—a powerful edge computing device that provides data storage and processing capabilities in places where there’s limited or no connectivity—and in just seven months worked with Axiom and NASA to prepare to send it to the ISS.

As AWS VP, Engineering, Bill Vass explained in his keynote this morning, although the Snowcone was designed for rugged, mobile, disconnected environments, it was not originally intended for use in space. Getting it ISS-ready meant subjecting it to NASA’s rigorous safety review process, from detailed thermal analysis to a series of laboratory tests simulating the random vibrations of a rocket during launch and of a spacecraft in flight.

The result? A Snowcone successfully installed on the ISS, enabling teams to quickly process, tag, and analyze photographs of onboard research experiments, and a major milestone for Axiom Mission 1, the first all-private mission to the ISS. Find out more about how AWS and Axiom pushed the limits to provide cloud computing capabilities in space.


Testing out wearable robotics

A close up image of legs walking on a treadmill with a device on the right leg.
An image of a man walking on a treadmill with devices hooked up to him.

Yves Nazon and Riley Pieper are two researchers from the Neurobionics Lab at the University of Michigan, and they're here at re:MARS to talk about their work developing ankle exoskeletons. Nazon got on the treadmill to show us how these can be programmed to provide people with assistance when running or walking. The pair are investigating how to incorporate user preference into the development of wearable robotics; for example, whether people prefer more or less assistance when walking uphill. The real-world application? These exoskeletons could provide support to people with very manual or physically demanding jobs, or those with limited mobility.


New services make coding and building ML models easier

Swami Sivasubramanian smiling while standing on the keynote stage at re:MARS. Behind him is a colorful screen that says "Amazon re:MARS"

Swami Sivasubramanian’s keynote at re:MARS this morning brought good news for developers, students, data scientists, experienced professionals, and just about anyone building ML models or wanting to be a more productive coder.

AWS’s VP of Data and ML Services announced that Amazon SageMaker Ground Truth now supports synthetic data generation. What does this mean? Well, if you want to build an ML model, you need to start by collecting large, diverse, and accurately labeled datasets, which can be challenging and time-consuming. A data scientist, for example, might spend months collecting hundreds of thousands of images to train a computer vision model. Once they’ve done that, they need to manually label the images, a process that’s slow and open to human error. One way to overcome this is by combining real-world data with so-called synthetic data—data created by simple rules, statistical models, computer simulations, or other techniques. It means you can "order" the exact data for the use case you are training your model for—adding variety that real-world data might lack, which in turn helps create more complete and balanced datasets. Find out more in this blog.
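The core appeal of synthetic data is that when you generate the image yourself, the label comes for free. The toy sketch below shows the idea at its simplest: it "renders" tiny grayscale grids each containing one bright rectangle, and the bounding-box label is known exactly by construction, with no human labeling. This is a minimal illustration of the concept only; SageMaker Ground Truth produces realistic rendered images from 3D assets, not grids like these.

```python
import random

# Toy illustration of synthetic data generation: because we draw the
# rectangle ourselves, its class and bounding box are known exactly.
def make_labeled_sample(size=16, rng=random):
    w, h = rng.randint(3, 6), rng.randint(3, 6)      # rectangle size
    x, y = rng.randint(0, size - w), rng.randint(0, size - h)  # position
    image = [[0] * size for _ in range(size)]        # blank "image"
    for row in range(y, y + h):
        for col in range(x, x + w):
            image[row][col] = 255                    # draw the rectangle
    # The label is a byproduct of rendering — no manual annotation.
    label = {"class": "rectangle", "bbox": (x, y, w, h)}
    return image, label

rng = random.Random(0)
dataset = [make_labeled_sample(rng=rng) for _ in range(1000)]  # 1,000 labeled samples, instantly
image, label = dataset[0]
print(label["class"], len(dataset))
```

Generating a thousand perfectly labeled samples takes milliseconds here, versus the months of collection and manual labeling described above; real synthetic pipelines trade that same labeling cost for rendering cost.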

Sivasubramanian also revealed that a new service, Amazon CodeWhisperer, is now available in preview. Designed to help people write better code and be more productive, while reducing routine and repetitive work, it’s trained on billions of lines of code and powered by ML. Learn more about CodeWhisperer and how to join the preview in this blog.


Remember, you're in the desert. Stay hydrated.

An image of a robot holding a water bottle. There is a sign that says "UNLV" in the background.


Unidentified flying objects

An image of a man on a Weel bike. He has his hands up and is riding with no hands.
The Weel team took their self-balancing, impossibly cool-looking electric bike for a spin under the glow of the light installation.

Remember those mysterious floating orbs we discovered on Day 1? After some investigation, we discovered that the installation was created by Hiro Seki, senior art director at AWS. Seki's goal was to "give the audience an art experience where the tremendous depth of the re:MARS entry hall could be felt through the undulating light sequences." It’s calming and cosmic. And we could sit under it all day.


Astro charmed crowds with beat boxing, animal noises, and dancing.

Astro beat boxes at re:MARS 2022


"They get excited about everything"

Amazon Future Engineer Teacher of the Year winners visit re:MARS

Amazon Future Engineer Teacher of the Year award winners told us what their students love learning about, why teaching is so rewarding, and how the prize is helping them fund activities to encourage young people to pursue careers in computer science and robotics.


Space is a participation sport

Amazon re:MARS 2022 - Day 3 - Keynote

Flying suits, snowballs in space, and the power of simulation.

There are a few consistent themes that have come out of re:MARS. Space is no longer confined to government-funded organizations, but has opened up to, and is being pursued by, more and more commercial enterprises. As Tom Soderstrom, director of chief technologists, AWS Public Sector, said in the event’s closing keynote, “Space is a participation sport.”

Another major theme is that machine learning (ML) is playing an increasing role in how problems are being solved across every industry, and all realms of academia. From understanding how “super” solar wind storms are formed, to developing digital twins of factories, helping developers write secure and unbiased code, and designing robots that can help human colleagues, ML is the key.

Running through everything at re:MARS, on the keynote stage and in the tech showcase demonstrations, was also an inspiring show of imagination and a willingness to fail. No one builds business parks in space, robots that can run on four legs, or a smart cane that can help people who have sight impairments without incredible imagination. That is paired with an understanding that it’s probably not going to work the first, or even fiftieth, time around. Failure is part of the equation of creation—or, as the founder and chief test pilot at flying jet suit company Gravity Industries liked to call it when he fell out of the sky from time to time in the early days, a willingness to “iterate.”

Register for free to watch the full keynote.

And if you need any more encouragement, some highlights from the closing re:MARS keynote follow.

Swami Sivasubramanian, VP, Data and ML Services, AWS

“Graph Neural Nets are the next big thing,” Sivasubramanian said. So, now you know. And if you don’t know what a GNN is: it starts with a graph. A graph consists of nodes—typically represented by circles—and edges—typically represented as line segments between nodes. In a knowledge graph, for instance, the nodes represent entities, and the edges represent relationships between them. In a social graph, the nodes represent people, and an edge indicates that two of those people know each other.

GNNs extend the performance benefits of deep learning models to all that graph data. Like other popular neural networks, a GNN model has a series of layers, which progress toward higher levels of abstraction.
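The core operation inside each of those layers is message passing: every node updates its features by aggregating its neighbors' features and transforming the result. The sketch below shows one such round over the social-graph example from the keynote (nodes are people, an edge means two people know each other). It is a toy illustration with a fixed scalar weight, not how production GNN libraries implement a layer.

```python
import math

# One toy message-passing round: each node averages its neighbors'
# feature vectors (plus its own, via a self-loop), then applies a
# "learned" scalar weight and a tanh nonlinearity. A real GNN layer
# would use a weight matrix learned by training.
def gnn_layer(features, edges, weight):
    """features: dict node -> list of floats; edges: (u, v) pairs."""
    neighbors = {n: [n] for n in features}  # self-loop for every node
    for u, v in edges:                      # undirected edges
        neighbors[u].append(v)
        neighbors[v].append(u)
    updated = {}
    for node, nbrs in neighbors.items():
        dim = len(features[node])
        mean = [sum(features[m][i] for m in nbrs) / len(nbrs) for i in range(dim)]
        updated[node] = [math.tanh(weight * x) for x in mean]
    return updated

# Social graph: "ana" knows "bo", "bo" knows "cy".
feats = {"ana": [1.0, 0.0], "bo": [0.0, 1.0], "cy": [1.0, 1.0]}
out = gnn_layer(feats, edges=[("ana", "bo"), ("bo", "cy")], weight=0.5)
print({k: [round(x, 3) for x in v] for k, v in out.items()})
```

Stacking several such layers lets information flow multiple hops across the graph, which is what the "higher levels of abstraction" above refers to: after two layers, "ana" has indirectly absorbed information about "cy".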

Roni Sole, software development manager, AWS

Building machine learning models requires training data, and often that means images. But finding and accurately labeling the number of images required is slow and often expensive. What if you could create your own images? That is what Sole announced in her segment—synthetic data generation for Amazon SageMaker Ground Truth. Using the tool, you can create labeled synthetic 3D images from a 3D model within hours to train your ML model.

Bill Vass, VP, Engineering, AWS

Vass showed cloud computing, way out on the edge, running in the International Space Station. He then turned to the power of simulation, and had a message for those still debating the use of simulation in their business. “If your business is not leveraging simulation,” Vass said, “to your competition you are basically standing still.”

Tom Soderstrom, director of chief technologists, AWS Public Sector

With all the activity in space, both in near-Earth orbit and on missions to the moon, Soderstrom sees a trend he’s labeled the “Internet of space things”: technologies that communicate and interact with each other, with space travelers, and with the rest of us down on Earth.

Nicola Fox, heliophysics division director, NASA

“Heliophysics is basically the study of the sun,” Fox said. Why does that matter? Because the sun means everything to us on Earth. It sustains us, but what Fox calls “solar super storms” can also take down power grids across the planet. Using machine learning models to analyze the data we have, Fox and her colleagues are looking for clues to what triggers the most potentially damaging solar storms, how we can predict the arrival of these events, and ultimately how we can safeguard against their impact.

Richard Browning, founder and chief test pilot, Gravity Industries

What Browning calls his “ludicrous idea” is now a full-blown and spectacular jet-powered human flight suit capable of deploying on military, search and rescue, and just plain fun missions. But it certainly wasn’t easy to build, or obvious that Browning could even pull it off in the early days. Browning persevered through the failed test flights, but always with this thought in mind: failure is inevitable when you are taking great imaginative and technological leaps, but you need to ask yourself, “If failure occurs, can you get back up again?” Browning said. “From a safety perspective, but also reputationally and financially?”


That's a wrap

Highlights from re:MARS 2022

That's a wrap on an amazing week diving into Machine learning, Automation, Robots, and Space. We saw and learned so much. Here are just a few of the highlights.


Signing off

An image of various robots lined up for a photo at re:MARS 2022.

From our robot family to yours, thanks for joining in re:MARS 2022. See you at the next one.
