Symbl for Startups founder profile: Aecho leverages conversation AI to dig into human psychology
This series takes a closer look at companies participating in the Symbl for Startups program. If you have any questions about applying for entry into the program or want to know more about the company featured in this profile, please contact email@example.com.
Imagine effortlessly being able to assess a potential or current employee’s engagement with certain projects, their work style, overall wellbeing, emotional stability and much more.
Aecho is taking behavioral analysis to the next level and fostering novel use cases that go beyond Human Resources. Notably, its product supports open-ended, free-form dialogue, which is unique in the tonal analysis space.
In the following Q&A, CEO and Co-Founder Adam Hocek discusses the bright future of voice interfaces and explains why Aecho’s solutions raise the bar in terms of understanding our fellow human beings through machine learning.
Tell me about how the idea for Aecho came about. What market need does it fulfill?
Adam Hocek: Aecho began with detecting emotions in speech using tonal analysis. From there, it expanded to recognizing psychological and mental health predictors from that same tonal analysis.
At the same time, we started to look at converting speech to text and leveraging conversational and other natural language processing techniques.
The initial market we targeted was Human Resources, because our technology could provide insight into a job candidate’s personality and behavior, giving a more complete picture of the person and how well they would fit into a company’s team and culture.
We realized that we could expand our market opportunities by making our capabilities available through an API, enabling integration with existing HR Tech products and subsequently allowing us to focus on our core technology.
This shift is leading to opportunities not only in the HR Tech space but also in other vertical markets such as mental health care, counseling and other emotionally and psychologically sensitive applications.
Aecho today can detect, through a combination of speech and text, over 164 different psychological and emotional traits.
When did you discover Symbl.ai’s API, and how did you know it was the right fit for Aecho?
AH: It has been approximately two years since we first started working with Symbl.ai’s API.
Two things made it apparent that Symbl.ai was the right fit for Aecho. First, the conversational capabilities of the API enabled us to focus immediately on our core value offering of psychological and behavioral analysis. We currently use Symbl’s speech-to-text, topic and summary services, and we plan to adopt other conversational services as we expand our own offerings.
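The Symbl.ai services mentioned here are exposed over REST: once a recording has been processed, a conversation ID can be used to pull the transcript and topics. Below is a minimal sketch in Python, assuming you already have a valid access token and a processed conversation; the endpoint paths follow Symbl.ai’s public Conversation API documentation as of this writing and should be checked against the current docs.

```python
# Hypothetical sketch of reading a transcript and topics from Symbl.ai's
# Conversation API. Helper names (conversation_url, fetch_json) are our own;
# endpoint paths and response fields are from Symbl.ai's public docs and
# may have changed -- treat this as illustrative, not authoritative.
import json
import urllib.request

API_BASE = "https://api.symbl.ai/v1"


def conversation_url(conversation_id: str, resource: str) -> str:
    """Build a Conversation API URL, e.g. .../conversations/<id>/topics."""
    return f"{API_BASE}/conversations/{conversation_id}/{resource}"


def fetch_json(url: str, access_token: str) -> dict:
    """GET a Conversation API resource using a bearer token."""
    req = urllib.request.Request(
        url, headers={"Authorization": f"Bearer {access_token}"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def transcript_and_topics(conversation_id: str, token: str):
    """Return (messages, topics) for an already-processed conversation."""
    messages = fetch_json(
        conversation_url(conversation_id, "messages"), token
    ).get("messages", [])
    topics = fetch_json(
        conversation_url(conversation_id, "topics"), token
    ).get("topics", [])
    return messages, topics
```

The `messages` resource returns the speech-to-text transcript and `topics` the detected topics; the same request pattern extends to the summary service.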
The second, and equally essential, reason we felt Symbl.ai was a good fit for us is the company and its culture. Symbl.ai has always been very supportive of our efforts; from the founders to every level of the company, it has helped us with technical and business support along the way.
Symbl.ai’s approach is to give back to the community, help other startups meet their objectives and forge long-term relationships. Small companies working together and helping each other is the way to make great things happen.
Can you explain some of the benefits of human-to-human speech analysis as it pertains to employee oversight?
AH: Human behavior is very complex and is influenced by many factors; most information is unconsciously expressed and perceived. At Aecho, we essentially use machine AI to understand human behavior and help people better understand others. Using human-to-human speech analysis we can recognize many psychological traits and emotions in an individual.
A speech interface allows users to formulate open-ended responses to questions, making for a natural, rich way to interact and gather information effortlessly. From these responses, a person’s personality, work style, social interaction ability, emotional stability, wellbeing and many other traits can be determined.
Knowing these traits can help determine whether an employee is engaged or disengaged with a project, has difficulty coping with certain changes, or is satisfied or dissatisfied with their current situation, all of which matter to most businesses.
How does Aecho compare to similar software in the employee management space?
AH: Employee management is a broad, competitive landscape. Some companies offer strong classic HR services paired with weaker assessment tools; fewer companies focus solely on assessment tools.
Aecho’s solutions are more aligned with the latter, offering sophisticated assessment tools without trying to cover the full range of HR Tech product features such as onboarding, reviews and compliance.
That said, there are many differences in these assessment solutions and ours. Some have predefined questions, others are based on matching specific criteria for job roles, while others offer customization of questions based on client requirements.
They all offer basic form and text capabilities, but none of them support open-ended responses. Aecho is the only one that offers tonal analysis in addition to text, supporting open-ended, free-form responses. This makes our solution more adaptive, faster and less expensive when it comes to making assessments.
Another differentiating factor is the number of emotional, psychological and behavioral traits we can measure. We currently have over 164 traits that we can detect in speech and text. The last significant differentiator and an important one is that we offer access to our emotion and psychological assessments through APIs, making these capabilities accessible to other HR and non-HR products.
Which Aecho features have your customers been most excited about and why?
AH: Our customers like that we provide a very large set of predictors (KPIs) for gathering psychological and emotional data through both text and voice.
We’ve been seeing more initial interest in text, but this reflects the fact that there are many more text solutions on the market than voice solutions. We see that trend changing as more and more voice interfaces are added to products.
The other area that we are starting to see customer interest in is for transcripts that include emotional and psychological markers.
How do you see Aecho evolving as the company gains users?
AH: Aecho will expand its current capabilities to include even more predictor traits in emotions, psychological profiles and in mental health and wellness.
These will open up opportunities for many new applications that can bring individual and societal benefits through the simplicity of voice, giving personal insights and, where necessary and with consent, sharing that insight with others.
This interview has been edited for clarity and brevity.