Ryan Hickman, who co-founded the Cloud Robotics group at Google and was an early member of the Toyota Research Institute Product team, describes how his startup tried to make consumer home robots work.
This is a guest post. The views expressed here are solely those of the authors and do not represent positions of IEEE Spectrum or the IEEE.
We started TickTock in March 2017, knowing that robotics was about to have a breakthrough, and it was going to start with mobility.
Soohyun Bae and I met at Google years earlier, where we both worked on augmented reality products. Soohyun went on to Magic Leap, and I helped launch Project Tango, now called ARCore. We both knew that AR’s push for mobile 3D mapping and scene understanding was driving a dramatic tech shift that would also benefit robotics. You’ll see hints of how that tech comes into play below as we share TickTock’s explorations into consumer robotic product opportunities.
Keep it simple
Our initial goal was to make the simplest robot possible that solved a real problem for users. First we had to set our team’s expectations for what’s technically possible and what user experiences were already available. We put up pictures of dozens of robots on the wall and looked for commonality and trends.
There were some clear categories that emerged. We quickly ruled out most:
- Toys: these products don’t pass the “30 day active” test for frequent long-term use that would allow the AI to get smarter over time
- Animatronic AIs: it’s not clear that physical motion is warranted versus being an app for an Echo Show (which didn’t exist when social robots started)
- Elder care: there’s a huge opportunity here yet major hurdles in selling new technology to someone with mobility or cognitive challenges, not to mention potential regulatory or safety issues
- Floor care: totally saturated market with massive price pressure from China and US distribution challenges from incumbents
- STEM/Edu/Maker: we love that our kids get to play with robots in school but these short interactions don’t lead to AI learning opportunities in the way that a 24/7 active robot in the home would
A super smart robot was going to cost hundreds of dollars so we really needed a user experience with enough utility to justify the purchase. It was likely going to become a new category since nothing like it existed out there.
It had to be mobile
We were honest with ourselves about being a hammer looking for nails, in that the product had to justify being a “robot” (sense > plan > physical action > learn > repeat) and also match our individual backgrounds and passion. That’s where mobility kept shining through. Moving things, whether household items or just a suite of sensors, is something a stationary IoT gadget can’t do. We move stuff all the time in our lives, and our possessions can’t be everywhere at once, so we started with a simple point-to-point mobility robot.
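The “sense > plan > physical action > learn > repeat” cycle above can be sketched as a minimal control loop. This is purely illustrative (all class and method names are hypothetical, and the planner is a placeholder), not TickTock’s actual software:

```python
class PointToPointRobot:
    """Minimal sketch of the sense > plan > act > learn cycle
    for a point-to-point mobility robot (illustrative only)."""

    def sense(self):
        # A real robot would fuse cameras, depth sensors, and odometry.
        return {"pose": (0.0, 0.0), "obstacles": []}

    def plan(self, state, goal):
        # Placeholder planner: one waypoint, straight at the goal.
        return [goal] if state["pose"] != goal else []

    def act(self, path):
        # Drive toward the next waypoint (a no-op in this sketch).
        return path[0] if path else None

    def learn(self, state, outcome):
        # Record outcomes so future plans can improve over time.
        pass

    def run(self, goal, steps=3):
        for _ in range(steps):
            state = self.sense()
            path = self.plan(state, goal)
            outcome = self.act(path)
            self.learn(state, outcome)

robot = PointToPointRobot()
robot.run(goal=(2.0, 1.0))
```

The point of the loop is the last step: unlike a stationary IoT gadget, every trip across the home generates data the robot can learn from.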
The box would include two small plastic stands for commonly accessed items in the home to sit on. This would allow robots to drive under them, lift the item up, and bring it to you (or put it away).
Products like trash cans and laundry baskets could be designed with a few inches of space below them for robots to move them any time.
Could having laundry and trash receptacles nearby at all times prevent clothes and trash from landing out of place, thus preventing the need to tidy up as often? Imagine a laundry basket that knows when your kids come home from soccer practice and places itself right where their uniform would normally hit the floor.
Picture waking up on trash day and all the cans are right by the door for you to dump into the bin and take to the curb — then returning the empty cans room by room after you head off to work.
Beyond static plastic, we felt that an ecosystem of accessories might become possible thanks to a power port on the back. The Essential phone does a nice job of this with two pins and magnetic docking.
This could enable a wide range of utility that reuses the presumably expensive parts in the mobile robot base. It addresses the fact that “general purpose robotics” doesn’t jibe with real-world physics, and recognizes that even dexterous humans have drawers full of tools and utensils to help us get work done. Robots too will need physical affordances to interact with the world, and accessories can be purpose-built for specific tasks.
Who knows, maybe people just want faster access to beer? A slide-out tray for a cooler on the ground is way more practical than a robot arm trying to reach behind the milk in your main fridge.
Too much of an ecosystem play for a startup
The breadth of the above use cases was the problem though. There’s too much reliance on accessories for a startup to break through. Investors kept asking us “what’s the one accessory that most users will buy?”, and suggested we make that as a fully integrated solution.
Our user research showed strong opportunities for an “Echo Show on Wheels” device (which I talk about here), but investors knew that was Amazon’s job and we had to be different. The accessory that we found most traction with was in cleaning. It’s a daily task and a pain to constantly fetch supplies, especially for families with kids making messes five times a day.
The Slider use case for cleaning started with the idea that we’d just go fetch a broom and dustpan. Users reacted with “Hey, it’s not the 1950s, I need my Dyson.” So we looked at transporting a handvac (whether from Bissell, Dyson, or Hoover), but found that people keep them in closets. That kept them out of sight but also out of reach for a robot to go fetch (opening doors was out of scope).
This meant we needed something attractive enough to remain in a common part of the home, ready to make itself available when needed.
Additional user feedback suggested we add cleaning supplies so people could fully handle any mess, not just vacuum the floor. Busy families loved the idea of one-handed cleanup, especially parents who frequently have to clean up both dry crumbs and wet spilled milk, often with a baby in one arm.
Slider was M-O from Wall-E made real
Slider’s contextual AI would learn when it was needed (e.g. after 7am breakfast on weekdays) and come clean up any messes on the floor. You’d have quick access to a handvac if there were crumbs in the chair, plus wipes and a trash bin for other messes (e.g. the oatmeal my kids ALWAYS miss getting in their mouth).
The contextual relevance of knowing where and when to be was key to addressing a major complaint with current robot vacs: they aren’t there when you need them!
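One rough way to get that contextual relevance is to log when messes actually occur and dispatch the robot once the current weekday-and-hour slot matches a learned pattern. This is a hypothetical sketch (the class name, threshold, and dates are all invented for illustration), not TickTock’s actual model:

```python
from collections import Counter
from datetime import datetime

class MessPredictor:
    """Sketch: learn when messes tend to happen (e.g. weekday
    breakfasts) from a history of observed mess timestamps."""

    def __init__(self, min_count=3):
        # (weekday, hour) -> number of messes observed in that slot
        self.counts = Counter()
        self.min_count = min_count

    def record_mess(self, when: datetime):
        self.counts[(when.weekday(), when.hour)] += 1

    def should_deploy(self, now: datetime) -> bool:
        # Deploy if this weekday/hour slot has repeatedly had messes.
        return self.counts[(now.weekday(), now.hour)] >= self.min_count

p = MessPredictor()
for day in (2, 9, 16):  # three consecutive Mondays (Oct 2017)
    p.record_mess(datetime(2017, 10, day, 7, 15))  # 7am breakfast messes
```

After three Monday-morning messes, `should_deploy` would fire the following Monday at 7am but stay quiet on Tuesday. A real system would need smoothing across adjacent hours and decay for stale habits, but the shape of the idea is the same.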
This brought up a new issue though. If a robot is going to be roaming around all the time, it needs to fit in among the family. There are tons of videos online where both kids and pets freak out when Roomba is active, so we needed better HRI (human robot interaction) and a friendly design.
Sir-B brings character to a utility appliance
My daughter Lily came up with the name, which started as “mister robot”. This wasn’t a vigilante hacker, so “Sir Robot” became “Sir-B”, a cute and approachable floor cleaning tidy bot.
We didn’t plan to have Sir-B talk, but it could play songs and move in fun ways to gamify cleanup activities. Toddlers respond to the Barney Cleanup song like Pavlov’s dog and instantly start cleaning things up.
You’d also get peace of mind when away from home by keeping an eye on things: pets, people under care, or just checking whether that pesky stove was left on.
Wait, isn’t robotic floor care a saturated market?
Yes, there are so many robot vacs out there that price competition is fierce. Incumbents like iRobot also have huge market share backed by vast distribution systems. While we felt strongly that users were craving a Sir-B-like product, we’d still have Roomba on the retail endcap while our robot sat halfway down the aisle on the bottom shelf getting no attention.
Investors wanted nothing to do with it. Thankfully we had Bryan De Leon, an amazing Industrial Designer and User Experience Researcher, working with John Moretti, an experienced consumer electronics Product Manager, who both iterated rapidly on new ideas. Their user studies and market research led us to drop the vacuuming but keep the fun cleanup aspects.
Peel back everything physical beyond mobility and you’ve got a super smart toy basket and interactive cleanup pal. Imagine age-relevant games to make kids feel like they’ve got support and motivation to tidy up.
Parents in the Silicon Valley bubble loved it! Even at $700 they were thrilled at the idea of delegating some nagging to a robot that encouraged responsibility in their kids. Got teenagers who don’t listen? The robot will cut off the TV and WiFi until all out-of-place objects are where they belong.
This is where we ran into the downside of being in Silicon Valley. The bubble only extends so far, and investors felt this product had no chance of working across the US, let alone the globe. For that price point it had to do more!
Time to revisit the home butler robot
One of the highlights of my time with TickTock came in meeting Nolan Bushnell. He’s equally known as the founder of both Atari and Chuck E. Cheese. My very first job in high school was as “Gameroom Technician” for the Chuck E. Cheese in West Palm Beach, FL, and of course Atari got me hooked on video games. Many thanks to Vijay Sundaram for the intro!
Lesser known are Nolan’s breakthrough home robot efforts. He’s one of the few people to have shipped millions of dollars’ worth of mobile home robots, and Androbot did it long before the benefit of modern mobile processors. His passion for fun and play is infectious, and we tried to embody some of that in our next concept — the TickTock Butler Robot.
The robot would let you keep an eye on things when you weren’t home. It would hold or transport commonly accessed items. Your kids might even be nudged into picking up their toys with a reward system set by the parents.
Of course we’d put the Google Assistant and/or Amazon Alexa on it as well for quick access to information in any room of the house (one robot per floor).
User experience is everything
TickTock focused a lot on UX, with voice, touch, mobile, web, and augmented reality modes for interaction. The robot needed to explore your home like a new pet and autonomously build a map right out of the box. Unlike a slow-moving floor vac, it needed to respond quickly, which required powerful vision systems to move safely at speed.
Setting up the robot and “training it” is uniquely suited for mobile augmented reality. We discovered that users didn’t require any training to simply point their phone at various locations and objects in their home to label them. From there we found that voice was the quickest way to command the robot, so we supported both Amazon’s Alexa and Google’s Assistant.
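A rough sketch of how AR-placed labels and voice commands could meet in the middle: assume the phone app stores each label with its map coordinates, and a spoken command is resolved against that table. All names and coordinates here are hypothetical, and real voice parsing would go through Alexa or the Assistant rather than substring matching:

```python
class LabeledMap:
    """Sketch: labels placed via the phone's AR view become
    named waypoints the robot can be sent to by voice."""

    def __init__(self):
        self.labels = {}  # label -> (x, y) map coordinates

    def add_label(self, name, xy):
        self.labels[name.lower()] = xy

    def resolve_command(self, utterance):
        # Naive matching: return the first known label in the utterance.
        text = utterance.lower()
        for name, xy in self.labels.items():
            if name in text:
                return name, xy
        return None

home = LabeledMap()
home.add_label("kitchen", (4.2, 1.0))     # tapped in AR during setup
home.add_label("front door", (0.5, 6.8))

target = home.resolve_command("take the trash can to the front door")
```

The appeal of the AR setup step is that it sidesteps the hardest part of voice control: the robot already knows exactly where “the kitchen” is because the user pointed at it during onboarding.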
Our AR app was called ARRViz, short for Augmented Reality Robot Visualizer (pronounced like a pirate, “arrrr-viz,” a play on ROS’s RViz). We found it super handy for seeing what the robot was thinking and how it viewed the world.
With an Echo Show mounted on top we got onboard voice recognition and video telepresence for free. Here we can see it roaming the TickTock offices with one team member remotely dialed into the Echo Show from his iPhone’s Alexa app.
TickTock’s home robot pursuit came to an end
It’s not clear which of the ideas above will take off first, but I suspect all of these products will exist some day. My family certainly had fun experimenting with them, even in their early unfinished state.
Timing has to be right for the tech to be mature, the price to be affordable, and users have to be willing to embrace new experiences. Augmented reality tech as used on the Project Tango Asus Zenfone AR definitely showed a viable path to autonomous navigation. Unfortunately, nearly 200 investors felt that 2017 wasn’t the right time for a startup to break through in this space on cost and market readiness, so all of these ideas had to be shelved.
TickTock pivoted to commercial robot opportunities in 2018, and I’ll share those product concepts soon. We’ll still be cheering on the consumer robot sidelines for Keecker, Kuri, Misty, Temi, and whatever Amazon is cooking up to succeed!
Ryan Hickman (@ryanmhickman) started at Google in 2007, and co-founded Google’s Cloud Robotics group in 2009. When Google acquired Motorola, Ryan worked on Project Tango within Google’s Advanced Technologies and Projects group, and most recently was supporting an effort to explore new hardware device opportunities for kids and families. In 2016, Ryan joined the Toyota Research Institute as an early member of its Product team. Ryan cofounded TickTock AI in February of 2017 to apply a new approach to AI for mobile apps and physical devices, and is currently the CEO. More of Ryan’s writing is available on Medium.
Inside TickTock's Consumer Robot Product Explorations