{"id":9003,"date":"2020-07-22T07:59:58","date_gmt":"2020-07-22T07:59:58","guid":{"rendered":"https:\/\/www.experfy.com\/blog\/?p=9003"},"modified":"2023-11-27T16:07:33","modified_gmt":"2023-11-27T16:07:33","slug":"the-heart-of-a-robot-computer-vision-ai","status":"publish","type":"post","link":"https:\/\/www.experfy.com\/blog\/ai-ml\/the-heart-of-a-robot-computer-vision-ai\/","title":{"rendered":"The Heart of a Robot? Computer Vision &#038; AI"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-post\" data-elementor-id=\"9003\" class=\"elementor elementor-9003\" data-elementor-post-type=\"post\">\n\t\t\t\t\t\t<section class=\"has_eae_slider elementor-section elementor-top-section elementor-element elementor-element-237869de elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"237869de\" data-element_type=\"section\" data-e-type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"has_eae_slider elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-7aabb2b8\" data-id=\"7aabb2b8\" data-element_type=\"column\" data-e-type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-33ee1dc elementor-widget elementor-widget-text-editor\" data-id=\"33ee1dc\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Robots have long been a staple of science fiction stories. Robby the Robot, with a distinct personality and a dry wit, first appeared in the 1956 movie\u00a0<em>Forbidden Planet<\/em>. And in the U.S. 
television series\u00a0<em>Lost in Space<\/em>, the Model B9 robot had both superhuman strength and musical talents.<\/p>\n<!-- \/wp:paragraph -->\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-38868e9 elementor-widget elementor-widget-text-editor\" data-id=\"38868e9\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Of course, we are still a long way from fully autonomous robots that offer personality, protection, and companionship\u2014on this planet, let alone in the farthest reaches of space. But we are starting to see glimpses of intelligent robots in a wide range of everyday applications:<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:list -->\n<ul>\n<li>Greeting customers, answering questions, and guiding retail shoppers<\/li>\n<li>Providing information about hospital facilities and guidance on continuing patient care<\/li>\n<li>Receiving guests, guiding them to reception, and transporting luggage to hotel rooms<\/li>\n<li>Accepting payments and gathering account information in banking centers<\/li>\n<li>Shuttling goods around warehouses and serving as after-hours security guards<\/li>\n<\/ul>\n<!-- \/wp:list -->\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-9f1c26a elementor-widget elementor-widget-heading\" data-id=\"9f1c26a\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">The Anatomy of a Smart Service Robot<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-8b0c745 elementor-widget elementor-widget-text-editor\" data-id=\"8b0c745\" data-element_type=\"widget\" data-e-type=\"widget\" 
data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<!-- wp:paragraph -->\n<p><strong>Video 1<\/strong>\u00a0shows Smart Service and Delivery Robots from\u00a0<a href=\"https:\/\/marketplace.intel.com\/s\/partner\/a5S3b0000016OT5EAM\/new-era-ai-robotic-inc?language=en_US\" target=\"_blank\" rel=\"noreferrer noopener\">New Era AI Robotic Inc<\/a>. The systems use simultaneous localization and mapping (SLAM) algorithms, voice and facial recognition software, and a comprehensive sensor suite to carry out the tasks mentioned above.<\/p>\n<!-- \/wp:paragraph -->\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-f3dd89c elementor-widget elementor-widget-video\" data-id=\"f3dd89c\" data-element_type=\"widget\" data-e-type=\"widget\" data-settings=\"{&quot;youtube_url&quot;:&quot;https:\\\/\\\/www.youtube.com\\\/embed\\\/v6w_Jj3dfD4?feature=oembed&quot;,&quot;video_type&quot;:&quot;youtube&quot;,&quot;controls&quot;:&quot;yes&quot;}\" data-widget_type=\"video.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-wrapper elementor-open-inline\">\n\t\t\t<div class=\"elementor-video\"><\/div>\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-1ecd1ec elementor-widget elementor-widget-text-editor\" data-id=\"1ecd1ec\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<!-- wp:paragraph {\"align\":\"center\",\"style\":{\"typography\":{\"fontSize\":11}}} -->\n<p class=\"has-text-align-center\" style=\"font-size: 11px;\">Video 1. Intelligent service and delivery robots are used as assistants in several industries. 
(Source:\u00a0<a href=\"https:\/\/www.youtube.com\/watch?v=v6w_Jj3dfD4&amp;t=2s\" target=\"_blank\" rel=\"noreferrer noopener\">New Era AI Robotic<\/a>)<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>These capabilities are executed on two separate subsystems: one for navigation and control, and the other to drive user interfaces.<\/p>\n<!-- \/wp:paragraph -->\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-43606f5 elementor-widget elementor-widget-heading\" data-id=\"43606f5\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">At the Core: Computer Vision and Deep Learning<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-39da96c elementor-widget elementor-widget-text-editor\" data-id=\"39da96c\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<!-- wp:paragraph -->\n<p>New Era\u2019s in-house SLAM technology is at the heart of its robots, allowing the 40- to 50-kg systems to safely navigate their surroundings. The deterministic, control-oriented SLAM software runs against input data from multiple sensors to give robots a 2D\/3D view of their surroundings for object detection, recognition, and avoidance.<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>\u201cAutonomous cars have many, many sensors,\u201d said Allen Tsai, chief engineer of SLAM software at New Era AI. \u201cLikewise, indoor robots can\u2019t just rely on one sensor. 
In real-world environments like shopping malls where there are a lot of people, nothing is static.\u201d<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>Initially, the systems leveraged just a 2D planar LiDAR sensor array. Although this LiDAR is cost-effective and reliable, it proved limiting for robots navigating dynamic three-dimensional spaces. By adding an Intel<sup>\u00ae<\/sup>\u00a0RealSense<sup>\u2122<\/sup>\u00a0camera to the design, New Era implemented stereo vision for better perception of angles, corners, and more (<strong>Figure 1<\/strong>).<\/p>\n<!-- \/wp:paragraph -->\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-9bf4d70 elementor-widget elementor-widget-image\" data-id=\"9bf4d70\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img decoding=\"async\" src=\"https:\/\/insighttech.intel.com\/wp-content\/uploads\/sites\/45\/2020\/01\/AI-computer-vision-machine-learning-facial-recognition-1-600x400.jpg\" alt=\"\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d3fcb14 elementor-widget elementor-widget-text-editor\" data-id=\"d3fcb14\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<figcaption>Figure 1. The Intel<sup>\u00ae<\/sup>\u00a0RealSense<sup>\u2122<\/sup>\u00a0camera provides depth perception and angular information. 
(Source:\u00a0<a href=\"https:\/\/www.digitaltrends.com\/computing\/intel-realsense-review\/\" target=\"_blank\" rel=\"noreferrer noopener\">Digital Trends<\/a>)<\/figcaption>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-07fe9db elementor-widget elementor-widget-text-editor\" data-id=\"07fe9db\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<!-- wp:paragraph -->\n<p>\u201cWith Intel RealSense, we are able to use classic <a href=\"https:\/\/www.experfy.com\/blog\/computer-vision-why-its-hard-to-compare-ai-and-human-perception\/\" target=\"_blank\" rel=\"noreferrer noopener\">computer vision<\/a> algorithms to enhance images and identify features,\u201d Tsai continued. \u201cAnd then we infuse that with our LiDAR sensor so we\u2019re not dependent on just one sensor.\u201d<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>A quad-core Intel<sup>\u00ae<\/sup>\u00a0Core<sup>\u2122<\/sup>\u00a0i5-based Linux PC processes sensor data from the LiDAR array and RealSense camera, then applies the SLAM algorithms to these inputs. These algorithms map out the physical space that a robot interacts with down to 5-centimeter accuracy. The software then overlays descriptors that identify characteristics like rooms, corridors, objects, etc.<\/p>\n<!-- \/wp:paragraph -->\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-b252876 elementor-widget elementor-widget-text-editor\" data-id=\"b252876\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<!-- wp:paragraph -->\n<p>The SLAM algorithms are extremely memory efficient, allowing thousands of maps to be stored on a robot\u2019s hard drive at any given time. 
As a result, each robot requires only 4 GB of DDR4 memory.<\/p>\n<!-- \/wp:paragraph -->\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-6f086ed elementor-widget elementor-widget-heading\" data-id=\"6f086ed\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Human Interaction with Facial Recognition and AI<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-a722f6f elementor-widget elementor-widget-text-editor\" data-id=\"a722f6f\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<!-- wp:paragraph -->\n<p>The second compute subsystem runs all of the applications necessary for interaction with humans, including\u00a0<a href=\"https:\/\/www.insight.tech\/cities\/is-facial-recognition-ai-s-first-killer-app\" class=\"broken_link\" rel=\"noopener\">facial recognition<\/a>, voice detection, chatbots, and a touchscreen UI. 
It is based on a Windows PC that leverages a quad-core Intel<sup>\u00ae<\/sup>\u00a0Pentium<sup>\u00ae<\/sup>\u00a0N4200 CPU and runs convolutional neural network (CNN) algorithms developed using the Intel<sup>\u00ae<\/sup>\u00a0OpenVINO<sup>\u2122<\/sup>\u00a0Toolkit (<strong>Video 2<\/strong>).<\/p>\n<!-- \/wp:paragraph -->\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d3d50c3 elementor-widget elementor-widget-video\" data-id=\"d3d50c3\" data-element_type=\"widget\" data-e-type=\"widget\" data-settings=\"{&quot;youtube_url&quot;:&quot;https:\\\/\\\/www.youtube.com\\\/embed\\\/238KPQUgQxI?feature=oembed&quot;,&quot;video_type&quot;:&quot;youtube&quot;,&quot;controls&quot;:&quot;yes&quot;}\" data-widget_type=\"video.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-wrapper elementor-open-inline\">\n\t\t\t<div class=\"elementor-video\"><\/div>\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-9f50c3d elementor-widget elementor-widget-text-editor\" data-id=\"9f50c3d\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<!-- wp:paragraph {\"align\":\"center\",\"style\":{\"typography\":{\"fontSize\":11}}} -->\n<p class=\"has-text-align-center\" style=\"font-size: 11px;\">Video 2. Robots use the Intel<sup>\u00ae<\/sup>\u00a0OpenVINO<sup>\u2122<\/sup>\u00a0Toolkit algorithms to detect human faces and emotions. (Source:\u00a0<a href=\"https:\/\/www.youtube.com\/watch?v=238KPQUgQxI\" target=\"_blank\" rel=\"noreferrer noopener\">Omar Lam Demonstration<\/a>)<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>OpenVINO helped New Era AI engineers optimize algorithms for execution on the Pentium processor, which contains an integrated Intel<sup>\u00ae<\/sup>\u00a0HD Graphics 505 GPU. 
This delivers enough throughput for images captured by the RealSense camera to be processed in real time. It also opens up a range of critical facial recognition functions.<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>The OpenVINO-optimized algorithms not only help the robots detect humans, but also analyze age, gender, and emotion. With this information\u2014collected as anonymized metadata\u2014robot operators can determine what demographic is most likely to interact with the robot, where, and for how long. In a retail or hospitality setting, for instance, these analytics can be used to maximize sales or improve customer service.<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>And thanks to local connectivity provided by the Windows PC, new algorithms, chatbots, and other software can be updated over time.<\/p>\n<!-- \/wp:paragraph -->\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-b900e3e elementor-widget elementor-widget-heading\" data-id=\"b900e3e\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">More Realistic Robots<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-5cbf28f elementor-widget elementor-widget-text-editor\" data-id=\"5cbf28f\" data-element_type=\"widget\" data-e-type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<!-- wp:paragraph -->\n<p>The engineers at New Era AI Robotic continue integrating technologies that will make interacting with their robot platforms a more natural, human-like experience.<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>For instance, next-generation designs may leverage 
Intel<sup>\u00ae<\/sup>\u00a0Movidius<sup>\u2122<\/sup>\u00a0vision processing units (VPUs) and\/or the Intel<sup>\u00ae<\/sup>\u00a0Neural Compute Stick, in conjunction with more advanced OpenVINO algorithms. This technology stack could have significant implications for the platform, enabling simultaneous multi-person communications, localized natural language processing (NLP), and even improved image throughput and resolution for more granular mapping and navigation.<\/p>\n<!-- \/wp:paragraph -->\n\n<!-- wp:paragraph -->\n<p>While intelligent robots are not yet capable of being intergalactic companions, they are leaps and bounds ahead of anything that was available just a few short years ago. They also offer a glimpse of the integrated human\/robot society we can look forward to in the years and decades to come.<\/p>\n<!-- \/wp:paragraph -->\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>While intelligent robots are not yet capable of being intergalactic companions, they are leaps and bounds ahead of anything that was available just a few short years ago. 
They also offer a glimpse of the integrated human\/robot society we can look forward to in the years and decades to come.<\/p>\n","protected":false},"author":760,"featured_media":9004,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"content-type":"","footnotes":""},"categories":[183],"tags":[226,487,411],"ppma_author":[3615],"class_list":["post-9003","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-ai-ml","tag-ai","tag-computer-vision","tag-robots"],"authors":[{"term_id":3615,"user_id":760,"is_guest":0,"slug":"clive-max-maxfield-2-2","display_name":"Clive Maxfield","avatar_url":"https:\/\/www.experfy.com\/blog\/wp-content\/uploads\/2020\/04\/medium_d96bfe3c-1e79-40e7-98cd-26c9c80fb5b5-150x150.jpg","user_url":"https:\/\/www.clivemaxfield.com\/coolbeans\/","last_name":"Maxfield","first_name":"Clive","job_title":"","description":"Clive (Max) Maxfield is Collector and Communicator of Technological Information at Maxfield High-Tech Consulting. He wrote books on Electronics, Computing, FPGAs, Mathematics, and 3D Graphics. 
In the past, he served as Contributing Editor, Editor, and Editor-in-Chief at a variety of publications, including EETimes.com, Embedded.com, and EEWeb.com."}],"_links":{"self":[{"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/posts\/9003","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/users\/760"}],"replies":[{"embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/comments?post=9003"}],"version-history":[{"count":7,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/posts\/9003\/revisions"}],"predecessor-version":[{"id":34404,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/posts\/9003\/revisions\/34404"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/media\/9004"}],"wp:attachment":[{"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/media?parent=9003"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/categories?post=9003"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/tags?post=9003"},{"taxonomy":"author","embeddable":true,"href":"https:\/\/www.experfy.com\/blog\/wp-json\/wp\/v2\/ppma_author?post=9003"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}