Tuesday, May 18, 2021

Digital Humans ‘residing’ Inside Computer Server Farms and Not Physical Robots Will Drive the Workforce of the Future.

How fast the paradigm changes. Up until this point we have been concerned about the impact that robots would have on jobs. It was, and is, a valid concern. It was, however, mostly limited to the impact of robotics on jobs in factories and warehouse operations, not on socially assistive robots. The pushback on socially assistive robots by skilled workers has been formidable. But a new technology is now emerging that will impact skilled knowledge workers: the combination of AI and digital human avatar interfaces.

If you are a CEO, COO or CTO, you simply must be actively adjusting your strategies, plans and operational designs now to this expanding technological advance. Why? Because it will radically change competitiveness and operational cost structures and drive improvements in brand loyalty. Digital humans are, across the board, landscape changing.

The combination of AI and digital human avatars delivers a powerful formula of knowledge, responsiveness, and sustained customer engagement. These are essential keys to any business success.

Is it live or is it Memorex? I am old enough to remember the advertising for a recording tape product from a company called Memorex. It challenged listeners to determine whether the vocal they were hearing was live or a recording. Digital humans are getting that good. Have you explored the current models, like those offered by UneeQ, a clear leader in the field? Can you tell the practical difference?

With AI powering the ‘brains’ of these digital humans, they are also getting smarter over time: the more they are used, the smarter they get. So deploying them as customer service agents, health care advisors or transaction assistants only makes sense. Add the benefit that you can deploy a digital human at significantly lower cost than a real human, or even a robot, and the business model for adoption becomes a no-brainer.
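The use-it-and-it-improves loop can be sketched in a few lines. This is an illustrative toy only, not any vendor's product API; the class name, knowledge table and phrasing are all invented. The key idea is that every query the agent cannot answer is captured for later training, which is how repeated use makes the system smarter.

```python
# Minimal sketch of a customer-service 'digital human' loop that logs every
# unanswered query so it can feed later (re)training. All names are hypothetical.

class DigitalHumanAgent:
    def __init__(self, knowledge):
        self.knowledge = knowledge      # canned answers keyed by topic keyword
        self.unanswered = []            # gaps to route into the training pipeline

    def respond(self, query):
        for topic, answer in self.knowledge.items():
            if topic in query.lower():
                return answer
        self.unanswered.append(query)   # capture the gap: used more, it learns more
        return "Let me connect you with a specialist."

agent = DigitalHumanAgent({"refund": "Refunds post within 5 business days."})
print(agent.respond("How do I get a refund?"))   # answered from the knowledge base
print(agent.respond("Do you ship to Canada?"))   # escalated and logged for training
print(agent.unanswered)                          # → ['Do you ship to Canada?']
```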

Plus, there are other easily discoverable additional advantages of digital humans.

Digital humans are an easily scalable resource, up or down, as service volume dictates, while robots are not easily scalable and in many attempted use cases have proven to be intrusive.

And with digital humans there are no HR problems, no medical benefits, no vacations or holidays, no limits on hours worked and no payroll taxes, to cite just a few.


Will this be your business model in 2022?

Or will this be your business model?


It is time to define a digital human strategy for your workforce and choose the technologies that you will use. It is also time to create a Human Resources (digital human) framework and associated policies. These digital humans will need to be evaluated, monitored, trained and managed. It sounds like human resources will be instrumental in deployments in this new digital age.

We are here to advise and assist. Don’t hesitate to reach out for conversation.

See our Visioneering Workshop: “Building the Digital Human Workforce” at www.robotteca.com

Mike Radice is Chairman of the Technology Advisory for ChartaCloud. Reach Mike at info@chartacloud.com

 

Wednesday, January 27, 2021

Why ‘Digital People’? They are ‘The New Face’ of AI/AGI and will become the interface to all things digital.

 I think we can all agree that AI/AGI is transforming the digital landscape of just about everything. What these transformative AI/AGI applications will herald is the need for a smart AI interface, a human-like interface. ‘Digital People’ are human-like, digitally created AI constructs (CGI animations; avatars) that appear to be humans and that can be empowered to converse, react, and respond quite like real humans.

Digital people technology will also advance the socially assistive robotics space, and adoption will be rapid. The current domain of humanoid-like robot hardware form factors used today for socially assistive robotics remains bogged down in search of use cases that justify their cost and clearly demonstrate utility. This is especially true in the in-home and retail store markets, which remain the ‘two burial grounds’ for today’s popular humanoid robot form factor hardware. In-home and retail space is just too limited and too obstacle riddled, and the intrusion of robots wandering around is just too great. Screen-based AI/AGI digital people platforms will be widely adopted.

 ‘Digital people’ robotic initiatives will deliver:

Voice controlled AI/AGI portals

A sense of authentic engagement powered by AI/AGI

Increased trustworthiness of robot interaction platforms

A growing sense of knowledgeable/learning companionship

An interface that feels like personal, live TV and escapes the ‘uncanny valley’.

 Bottom line is that deployable digital people technologies are significantly more scalable ‘in the wild’ than robots using the traditional form factor.

 I predict that ‘digital people’ will become the interface to all things digital, the new face of an AI/AGI powered world. Get ready to run your entire personal day, your home, and your business via your own personalized ‘digital person’. Perhaps we are witnessing the beginning of the end of the keyboard?

Michael Radice is the Chairman of the Technology Advisory for ChartaCloudRobotics.com and ROBOTTECA.com. You can reach Mike at info@chartacloud.com


Thursday, January 7, 2021

Here We Go. Robots Can Now Decode Human Personality Traits.

 

The ability of software to recognize faces has fast become a staple across a wide variety of software, security and robotic systems and technologies. Now, with the advances and power of AI (artificial neural networks, actually), we can apply diagnostic extraction technology designed to identify human personality traits from captured facial images; essentially, profiling the personality makeup of observed humans. This is what I call ‘derived facial diagnostics’. It is happening, and it is making robots smarter, more powerful and, yes, more human-like.

Long considered a pseudo-science, the technique is now being validated by AI: the very construct of one’s face does present a roadmap into DNA-defined personality. I reference a just-published research report on Nature.com: “…results demonstrate that real-life photographs taken in uncontrolled conditions can be used to predict personality traits using complex computer vision algorithms.” (1)

My purpose here is not to discuss the surrounding ethical, moral and social implications of this technology, only to observe that, when properly used, it can significantly advance the potential utility of robot-human interactions. No matter your position on these matters, history teaches that the derived benefits will surely ensure that it happens.

So how might such a technology be used in and with robots?

This technology has already advanced to the point where a facial image capture can return key personality traits in under two seconds: quite sufficient for the robot to gauge a path for continuing the interaction or dialogue. So, whether it is making recommendations on food choices, clothing, hotels or vacation destinations, a more informed robot can provide a more informed suggestion.
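The pipeline described above, face image in, trait scores out, can be sketched as an embedding mapped through a small prediction head. This is a toy illustration under stated assumptions: the 128-dimension embedding size, the random weights and the sigmoid head are stand-ins, not the trained computer-vision models used in the study cited in (1).

```python
# Illustrative sketch of 'derived facial diagnostics': a face embedding is mapped
# to Big Five trait scores by a small linear head. Weights and the embedding are
# random stand-ins for a trained model; the structure is what matters here.
import math
import random

TRAITS = ["openness", "conscientiousness", "extraversion", "agreeableness", "neuroticism"]
DIM = 128                                   # assumed face-embedding size

rng = random.Random(0)
weights = {t: [rng.gauss(0, 0.1) for _ in range(DIM)] for t in TRAITS}  # stand-in head

def predict_traits(embedding):
    """Map a DIM-length face embedding to five trait scores squashed into (0, 1)."""
    scores = {}
    for trait, w in weights.items():
        logit = sum(wi * xi for wi, xi in zip(w, embedding))
        scores[trait] = 1.0 / (1.0 + math.exp(-logit))   # sigmoid to a 0-1 scale
    return scores

embedding = [rng.gauss(0, 1) for _ in range(DIM)]  # stand-in for a face encoder's output
scores = predict_traits(embedding)
print({t: round(s, 2) for t, s in scores.items()})
```

A robot could branch its dialogue on whichever trait scores highest, which is all the sub-two-second response time needs to enable.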

Analyzed over a series of interactions, marketers can refine their offers by augmenting robot-captured queries with the key personality traits of the queriers. Communications via messages, advertisements and product offers, powered by personality-based diagnostics, would supply savvy marketers with an advantage. No matter how small the competitive advantage, it is a knowledge-based advantage that will tip the competitive scale.

Extending robot interaction is another. Humans get bored quickly, and this is a serious issue with robot engagement. If the human feels disconnected from the interaction by way of generalized robot responses, they simply walk away. However, if the human feels more deeply connected to the dialogue, which can now be made personality driven, the likelihood of continued engagement increases. More time in the interaction means more time to sell, suggest or convey a promotional theme or message. More success.

My testing of this technology has served to substantiate its power and advancing validity. If you are a robot developer, feel free to reach out to me for a discussion about securing early preview access to this technology.

(1) Assessing the Big Five personality traits using real-life static facial images. Alexander Kachur, Evgeny Osin, Denis Davydov, Konstantin Shutilov & Alexey Novokshonov

 

Mike Radice is Chairman of the Technology Advisory at ChartaCloudRobotics.com and Robotteca.com. You can contact Mike at info@chartacloud.com

Friday, November 13, 2020

8 Things That Successful Social Robot Developers Do.

1.      Focus deeply on specific use case target solutions.

 

Too many potential adopters are led to believe that a social robot can come right out of the box and do most anything and everything. Talking to markets in broad terms, serving retail, hotels, schools and healthcare, is so overused that the market is numb to the pitch. “Tell me specifically how and why I should use this or that robot and the benefits and contributions it will make.” Too many robot developers rely on the serendipity of the market to identify on its own the potential and value of adopting a robot. Post-sale, the inevitability of not meeting customer expectations leads to mismatch and dissatisfaction.

 

2.      Invest and provide proven business and consumer ROI models.

 

Business and most personal decisions are based on benefits. Read: ROI. Potential customers need at least a starting point to assess (model out) the financial impact and contribution the robot can provide. Even the ‘soft’ metrics for measuring the impact social robots deliver can and should be quantified; e.g., the quality-of-life improvements for elders or the reduction of pain and anxiety in children can be established.

 

3.      Face head-on, with solutions and guidelines, the regulations, liability and ethics issues that stifle robot adoption and deployment.

 

Introducing a robot for any use brings with it real concerns about potential mishaps, security or privacy breaches and ethics. In some industries it might be general liability, if, for example, a robot spills hot coffee on a customer. In other industries the regulations are significantly more rigid, such as HIPAA privacy in health care. These issues cannot be ignored, overlooked or left for the customer to navigate alone. Clear guidance and policies need to be prepared and ready for customer presentation and evaluation.

 

4.      Recognize that most businesses and organizations are not staffed with the skills to evaluate and adopt robots.

 

We are still very early in the robot adoption game when it comes to having ‘knowledgeable buyers’ to work with. This means that deep buyer ‘skill capacity’ fears surround the many selection and support issues of robot adoption. Being prepared to document the skills required, the training program resources available, the mutual obligations and commitments expected, and the daily robot management guidelines are just a few of the meaningful value-added components needed to ensure successful sales and ongoing satisfaction. Experience teaches that too many robots are purchased without a clear understanding of what is ultimately required.

 

5.      Do not push robots into the public domain that simply do not yet ‘fully’ work.

 

Sadly, this issue speaks for itself.

 

6.      Put in place a customer service/support model.

 

We are still in the age of ‘robots in the wild’. The litany of issues is manifest: breakdowns, failures, abuse, a robot’s confusion with the surrounding electronic and physical environment, and quality of connectivity are but a few. What is the plan for repair? What is the method and cost of repair? How long do repairs take? Where do repairs happen? Who handles the repair shipping preparations? Who pays for the insurance coverage of repair shipments? These are but a few of the mutual obligations robot adopters need to address and make clear.

 

 

7.      Focus on partner success rather than drive to build up channel partner inventory.

 

It is hard to find a robot developer organization that understands the value of channel partners and seeks to establish a true partner relationship. The first question from too many robot manufacturers is all too often: “How many of our robots will you inventory?” The second question: “What are your forecasted sales?” Good gosh, most reseller partners have not even been given the opportunity to test drive a robot they may be interested in before these questions are asked. Reseller partners can be a valuable contributing resource toward all of the above noted issues. If you are not willing to invest in a partnership, do not enter one.

 

 

8.      Understand that, overall, the number of robots forecasted to be roaming the streets and used in homes is untenable.

 

Reflect on the true nature of the market. It is a marketplace over-hyped with adoption volume forecasts. Think about it: if the number of robots forecasted to be deployed were realized, there would be no room on the streets or in homes for people.

 

If you wish to compete successfully and survive, re-read items 1-7 above.

 

Mike Radice is Chairman of the Technology Advisory for Robotteca.com and ChartaCloudRobotics.com. Mike can be reached at info@chartacloud.com

Monday, October 12, 2020

Here We Go –The Leap from Facial Recognition to Derived Personality Traits from Facial Image Diagnostics

It is happening, and it is making robots smarter, more powerful and, yes, more human-like. The ability of software to recognize faces has fast become a staple across a wide variety of software, security and robotic systems and technologies. Now, with the advances and power of AI (artificial neural networks, actually), we can apply diagnostic extraction technology designed to suggest personality traits from captured facial images; essentially, profiling the personality makeup of observed humans. This is what I call ‘derived facial diagnostics’.

Long considered a pseudo-science, the technique is now being validated by AI: the very construct of one’s face does present a roadmap into personality. I reference a just-published research report on Nature.com: “…results demonstrate that real-life photographs taken in uncontrolled conditions can be used to predict personality traits using complex computer vision algorithms.” (1)

My purpose here is not to discuss the surrounding ethical, moral and social implications of this technology, only to observe that, when properly used, it can significantly advance the potential utility of robot-human interactions. No matter your position on these matters, history teaches that the derived commercial benefits will surely ensure that it happens.

So how might such a technology be used in and with robots?

This technology has already advanced to the point where a facial image capture can return key personality traits in under two seconds: quite sufficient for the robot to gauge a path for continuing the interaction or dialogue. So, whether it is making recommendations on food choices, clothing, hotels or vacation destinations, a more informed robot can provide a more informed suggestion.

Analyzed over a series of interactions, marketers can refine their offers by augmenting robot-captured queries with the key personality traits derived from those accumulated queries. Communications via messages, advertisements and product offers, powered by personality-based diagnostics, would supply savvy marketers with an advantage. No matter how small the competitive advantage, it is a knowledge-based advantage that can tip the competitive scale.
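The accumulation step, refining a profile over repeated interactions and steering the offer accordingly, can be sketched simply. The trait values and the trait-to-offer mapping below are invented for illustration; a real deployment would use a marketer's own offer catalogue.

```python
# Sketch: average trait estimates accumulated across a shopper's visits,
# then pick a promotional theme by dominant trait. Data here is hypothetical.
from statistics import mean

history = [  # per-visit Big Five estimates for one shopper (invented numbers)
    {"openness": 0.8, "extraversion": 0.4},
    {"openness": 0.7, "extraversion": 0.5},
]

profile = {trait: mean(visit[trait] for visit in history) for trait in history[0]}
offers = {"openness": "new arrivals", "extraversion": "group events"}  # assumed mapping
dominant = max(profile, key=profile.get)
print(offers[dominant])  # → new arrivals (openness averages highest here)
```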

Extending robot interaction is another. Humans get bored quickly, and this is a serious issue with robot engagement. If the human feels disconnected from the interaction by way of generalized robot responses, they simply walk away. However, if the human feels more deeply connected to the dialogue, which can now be made personality driven, the likelihood of continued engagement increases. More time in the interaction means more time to sell, suggest or convey a promotional theme or message. More success.

My testing of this technology has served to substantiate its power and advancing validity. If you are a robot developer, feel free to reach out to me for a discussion about early preview access to this technology.

(1) Assessing the Big Five personality traits using real-life static facial images. Alexander Kachur, Evgeny Osin, Denis Davydov, Konstantin Shutilov & Alexey Novokshonov

Mike Radice is Chairman of the Technology Advisory at ChartaCloudRobotics.com and Robotteca.com. You can contact Mike at info@chartacloud.com

Friday, July 10, 2020

3 Consternations Developing at the Front Lines of Robotics


Every business modeler understands that defining strengths, weaknesses, opportunities and threats is central to clear and comprehensive strategic planning. Unless the three long-game strategic concerns outlined below find their rightful place in that strategic planning model, physical robots are headed for a not-so-pretty inflection point; at minimum they will face major constraints on long-term deployment, viability and use.

Imagine the “robot jam”

Let’s be practical. There is only so much space inside buildings, hospitals, nursing homes and transportation centers, on sidewalks, in retail establishments and, yes, particularly in restaurants and homes. If even half the forecasts of ‘future robots to be deployed’ are realized, robots will be crowding out people and running into one another. Simply said, the current model of physical/mobile robot utilization is not scalable. Worse yet, such large-scale deployments will create social-space chaos, bringing in the regulators, licensors and taxation, let alone unleashing the liability lawyers seeking compensation for robots obstructing and crashing into people and things, or standing still to avoid collisions.

Who is Going to Service These Bots, effectively?

Let’s be even more practical. No one can expect every deployed robot to function over time without failure or damaging incident. Such ‘failures’, as they will assuredly arise, may be as simple as needing a battery replacement, or more dramatic: retrieving a robot from the bottom of a swimming pool, collecting one at the bottom of a set of stairs, or recovering one stranded and immobile on a sidewalk or in a doorway. How about when some malicious person douses a robot in public with foam or glue spray? Or when someone just picks it up and steals away with it? No matter; the point is that I have yet to learn of any robot manufacturer’s nationwide model for on-site monitoring, pickup and repair service. The current model espoused by robot developers and manufacturers places the onus on the customer to monitor, retrieve, diagnose, package and ship for repair. Who is going to take care of, and how, all the physical/mechanical issues generated by these thousands of forecasted robot deployments and their inherent failure rates?

“Amusing at best”.

The third issue is the human interface expectations set by the physical style of mechanical robots. Most robot designs thus far seek to convey a human-like motif, a set of attributes seemingly designed to convey comfort and familiarity to the interfacing human. They usually have heads, blinking eyes and arms; some have legs. The problem is that in following this path, the engagement and response expectations of the robot are set beyond what is prototypically possible today. Most humans interfacing with a mechanical robot soon drift away, somewhat amused maybe, but typically underwhelmed. Truthfully, there are two factors at work. First, most deployed robots are ‘one-trick pony’ demonstrators. I’ve watched people (i.e. customers) walk right past a robot in a public environment and, when asked about doing so, state: “Oh yeah, I spoke to that robot the other day. It has nothing new to say.” This is not the robot’s fault so much as the content’s, which is usually woefully weak if not silly. Secondly, there is hardly ever a sense that you are actually connecting individually with the robot and being engaged as a unique person in a useful or rewarding conversation. Left as is, robots will continue to be seen as not much more than a gimmick that does dances and takes selfie photos.

This is why we need smart, AI-powered robots that can engage individuals, detect emotional conditions and conduct a ‘pathway’ of logical, in-depth conversation. In summary, my belief is that we need to move away from the fixed structural/mechanical robot models so popular today and move to, or at least create, a new class of what I foresee as ‘soft robots’. Having seen the emerging screen-based, animated, AI-powered kiosk creatures that can convey engagement and be much more ‘alive’ without the scalability constraints of physical, mechanical platforms, I am heartened that it is possible. These soft robots, these ‘artificial creatures’, I predict will be the new interface.

Smart robot developers would be wise to move to these ‘artificial creature’ style interfaces.

These three industry impacting considerations need to be crafted and integrated into a new era solution that creates a future robot world that is much more scalable, manageable, resilient and yes, more satisfying to humans.

Mike Radice is Chairman of the Technology Advisory for ChartaCloud Robotics, https://CHARTACLOUDROBOTICS.com and https://www.ROBOTTECA.com info@chartacloud.com


Monday, May 4, 2020

Are You Watching? 3 Game Changers in Robotics



#1: Putting Humans in the Robot Loop – Game Changer

In these unique times, all things ‘robot’ have begun to move very fast. What business has been resisting about robots for the last decade is fast becoming a priority. Robots previously considered job killers all of a sudden look like a brilliant solution to tasks that are dull, dangerous, dirty and toxic. A crisis will have that effect.
The point is that we now need more robots working as fast, efficiently and effectively as possible. However, the truth is that the world remains a complicated place for a robot. There are problematic times when even a robot needs a helpful human hand. We now realize that injecting human intelligence by positioning a ‘human in the robot loop’ makes a big difference. The requirement for a ‘live’ human-robot interface link is proving to be landscape changing, in a positive way, for successful robot deployments.
Being able to inject human intelligence into and through a robot via a human-robot interface link, especially at the right or a critical moment, has proven highly beneficial. It may be as simple as helping a robot get back on its ‘map’, getting it around an unforeseen impediment or obstacle, or taking over a conversation when an AI-powered retail robot has run out of pre-programmed knowledge and expertise.
Millions of robots are already deployed, and the number will continue to grow. There are those who predict that robots will at some point outnumber cell phones as the ubiquity of robots increases in the newly emerging economic and social fabric. At present, however, we constantly hear that the robots are not ready for prime time; and, to a great extent, that is true. Reality still does not meet expectations. We expect a lot of our robots. For robot developers and users alike, the stakes are high. Artificial intelligence, machine learning and deep learning remain essential elements in future robot-based solutions. Adopting a ‘human-robot interface link’, a human in the loop with the robot via cloud-based software, offers an immediate and powerfully functional solution in support of the demand for rapidly expanding the use and deployment of robots.
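The human-in-the-loop pattern described above amounts to a confidence-gated handoff: the robot acts autonomously when it is sure and escalates to a live operator when it is not. The sketch below illustrates the control flow only; the confidence threshold, queue and function names are assumptions, not any vendor's actual cloud API.

```python
# Sketch of a 'human in the robot loop': confident queries are handled
# autonomously; uncertain ones are handed off to a human operator queue.
from queue import Queue

operator_queue = Queue()  # stand-in for a cloud link to a live human operator

def handle(query, confidence):
    if confidence >= 0.8:             # robot is sure: proceed autonomously
        return f"robot: handling '{query}'"
    operator_queue.put(query)         # robot is unsure: escalate to a human
    return f"robot: escalating '{query}' to an operator"

print(handle("where is aisle 5?", 0.95))
print(handle("my order arrived damaged", 0.30))
print(operator_queue.qsize())  # → 1 escalation waiting for a human
```

The same gate works for navigation (stuck off-map) as for dialogue (out of pre-programmed knowledge); only the confidence signal differs.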

#2: Cloud-based Robotics Software:  Setting the Stage for Robot Ubiquity – Game Changer

Robot developers are fast coming to understand that fully comprehensive, fully autonomous, multi-purpose, multi-functional robots are not within their current grasp if they rely on the on-board computing power of their hardware platforms alone. The increasing sophistication of current robots is primarily the result of access to powerful cloud-based computing and software. As a result, a whole new class of software and services providers has evolved, focused on creating ‘cloud-based robotics’ software platforms designed to meet the increasing needs for rapid application development and for monitoring, controlling and collecting data that analyzes the use of robots in fleets and at scale.
Cloud-Based robotics software will mature in at least these three ways.

A.   Software and services that allow robot developers to focus their engineering talent and resources on their unique platform attributes while looking to off-the-shelf software for the non-unique ones. Using this class of software will lower development costs, speed time to market and increase the reliability, and thus the ROI, of robotic platforms.

B.   Introduction of Robot Access Interface Layer (RAIL) software that allows ‘the common person’ to use and control robots and create their own personal applications. Controlling the attributes of robot behavior has until recently remained beyond the reach of the population at large. But that is changing with software that can be placed on a robot and interface with its primary functionalities, such as speaking, moving, interfacing with other applications, and interfacing with the IoT devices that control homes and monitor health.
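A RAIL as described in point B can be pictured as a thin wrapper exposing a robot's primary functions so a non-programmer can compose behaviors. RAIL is a concept here, not a shipping library, so every class and method name below is illustrative; a real layer would forward these calls to the robot's speech and drive systems.

```python
# Sketch of a Robot Access Interface Layer (RAIL): a thin layer exposing
# a robot's primary functions (speak, move) for end-user scripting.
class RobotAccessInterfaceLayer:
    def __init__(self, robot_name):
        self.robot_name = robot_name
        self.log = []                       # record of issued commands

    def speak(self, text):
        self.log.append(("speak", text))    # a real layer would call the robot's TTS
        return f"{self.robot_name} says: {text}"

    def move(self, meters):
        self.log.append(("move", meters))   # ...and its drive system
        return f"{self.robot_name} moves {meters} m"

rail = RobotAccessInterfaceLayer("GreeterBot")
rail.speak("Welcome to the store!")
rail.move(1.5)
print(rail.log)  # the 'common person' composes behaviors from these primitives
```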
C.   Software that ushers in an entirely new concept of robots: robots that do not need to be embodied as hardware devices on wheels in order to be of service and value. There are now soft robots. Imagine an animated, avatar-style robot creature that is itself AI-smart and every bit as capable of interaction as a hardware-based robot. In this instance we have animated creatures on a screen that are sensitive to touch, can recognize faces, recognize emotions and dialogue in an engaging fashion. These robot creatures appear and act as if they are themselves alive, and you sense that they recognize that you actually exist. More importantly, delivered via information-style kiosks (Mirror, Mirror on the wall) or vibrantly animated on reactive, flexible robot arms, these robots are scalable in a future world that otherwise, if all comes to pass, would be awash in mechanical robots running all over the place and into each other.

#3: Coming Sub 20ms Network Latency – A Game Changer

On April 23rd, 2020, the U.S. FCC approved opening the 6 GHz band for Wi-Fi (Wi-Fi 6E). Increasing the WiFi spectrum means up to 4 times more capacity, a 40% increase in data throughput, and increased multi-streaming capacity. Add 5G telecommunications and the network slicing capabilities of software-defined networks, and we are pressing toward network latencies that will rival human neural networks in speed, and thus toward ‘seamless’ human-robot communications.
For those of us who have been working in robotics, these times are proving to be the most energizing yet when you combine the role robots are playing today in fighting the COVID-19 virus with the anticipated role that robots will play in our post-crisis world.

Michael D. Radice is Chairman of the Technology Advisory Board for ChartaCloud ROBOTTECA, www.robotteca.com and www.chartacloudrobotics.com. Mike can be reached by e-mail at mike@chartacloud.com.