Cluster and Classify Videos Using Human Activities & Movements

a new category of video intelligence

 

Technology

Automatic Video Activity Recognition (AVAR)

Patented AI algorithms for recognizing human activities and movement are central to linedanceAI. The AVAR API runs over video files or video streams to identify and analyze human patterns of behavior through their activities and movement. AVAR's deep-learning capability in video builds a safer tomorrow for everyone, everywhere.
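
As a rough illustration, the sketch below shows how a client might submit a video clip to an AVAR-style analysis endpoint and read back recognized activity segments. The endpoint URL, parameters, and response fields are hypothetical placeholders for this example, not linedanceAI's published API.

```python
# Hypothetical client call to an AVAR-style analysis endpoint.
# The endpoint URL and response fields are placeholders, not a published API.
import requests

AVAR_ENDPOINT = "https://api.example.com/v1/avar/analyze"  # placeholder URL

def analyze_video(path: str, api_key: str) -> dict:
    """Upload a video file and return recognized activity segments."""
    with open(path, "rb") as f:
        response = requests.post(
            AVAR_ENDPOINT,
            headers={"Authorization": f"Bearer {api_key}"},
            files={"video": f},
        )
    response.raise_for_status()
    # Assumed shape: {"segments": [{"start": 0.0, "end": 2.4, "activity": "walking"}]}
    return response.json()

if __name__ == "__main__":
    for seg in analyze_video("clip.mp4", api_key="YOUR_KEY").get("segments", []):
        print(seg["start"], seg["end"], seg["activity"])
```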

 

Making Pixels Smart

Billions of videos. Everywhere. Searching, querying, or understanding interrelations between videos is unthinkable at scale. linedanceAI's AVAR converts videos into smart pixels by recognizing and clustering human movements to create new insights. No images are stored; only the movement of body joints is kept, revealing new data.
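
A minimal sketch of the kind of record this implies: only body-joint trajectories over time are retained, never the source pixels. The field names and joint labels here are illustrative assumptions, not a linedanceAI schema.

```python
# Illustrative data model: joint trajectories only, no image pixels retained.
# Field names and joint labels are assumptions, not a linedanceAI schema.
from dataclasses import dataclass, field

@dataclass
class JointSample:
    joint: str         # e.g. "left_wrist"
    x: float           # normalized image coordinates
    y: float
    confidence: float  # detector confidence in [0, 1]

@dataclass
class MovementFrame:
    timestamp: float                        # seconds from start of clip
    joints: list[JointSample] = field(default_factory=list)

@dataclass
class MovementTrack:
    person_id: int                          # anonymous track id, not an identity
    frames: list[MovementFrame] = field(default_factory=list)

# A clip reduces to a list of MovementTrack objects; the source frames can be
# discarded once the joint trajectories have been extracted.
```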

Classify & Search

Machine-learning models classify human movement by specific activity in a video or stream and generate annotations for search.
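
For example, classified segments could be folded into a simple annotation index that maps each activity label to where it occurs. The segment format below mirrors the hypothetical response sketched earlier and is an assumption, not a documented output.

```python
# Building a searchable annotation index from classified activity segments.
# The segment dict shape mirrors the hypothetical response used above.
from collections import defaultdict

def build_activity_index(
    clips: dict[str, list[dict]],
) -> dict[str, list[tuple[str, float, float]]]:
    """Map each activity label to (clip_id, start, end) occurrences."""
    index: dict[str, list[tuple[str, float, float]]] = defaultdict(list)
    for clip_id, segments in clips.items():
        for seg in segments:
            index[seg["activity"]].append((clip_id, seg["start"], seg["end"]))
    return index

# Example:
# index = build_activity_index({"cam01": [{"activity": "running", "start": 3.0, "end": 7.5}]})
# index["running"]  ->  [("cam01", 3.0, 7.5)]
```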

Movement Unique As A Fingerprint

Deep-learning algorithms run over movement data to measure the similarity of a gait or other signature movements, enabling identity or movement verification, pattern recognition, and performance analysis.
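
A hedged sketch of the comparison step: two movement signatures are reduced to embedding vectors by some learned encoder (out of scope here) and compared by cosine similarity against a threshold. The function names and threshold are assumptions for illustration only.

```python
# Comparing two movement signatures by cosine similarity of learned embeddings.
# The embedding encoder itself is out of scope; the threshold is illustrative.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def same_movement_signature(emb_a: np.ndarray, emb_b: np.ndarray,
                            threshold: float = 0.9) -> bool:
    """Verify whether two gait embeddings likely match the same movement pattern."""
    return cosine_similarity(emb_a, emb_b) >= threshold
```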

Predict Behaviors

Learned sequences of movement for corresponding behaviors generate alert indicators that preemptively identify risk or threat.
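
One simple way to picture this: scan a stream of classified activities for a learned sequence and raise an alert when it appears contiguously. The labels and sequence below are invented for illustration only.

```python
# Raising an alert when a learned activity sequence appears in a classified stream.
# The labels and sequence are invented for illustration only.
def contains_risk_sequence(activities: list[str], risky: tuple[str, ...]) -> bool:
    """Return True if the risky activity sequence occurs contiguously."""
    n = len(risky)
    return any(tuple(activities[i:i + n]) == risky
               for i in range(len(activities) - n + 1))

# contains_risk_sequence(["walking", "loitering", "climbing"],
#                        ("loitering", "climbing"))  ->  True
```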

 

Another side of video data...

In 2013 we began to develop novel machine-learning and image-processing approaches to automatically learn and analyze human body movement. Early sources of data were sign language movements. Designing algorithms to learn a body-movement-based language with high fidelity from a low number of samples gave rise to our AI-as-a-Service algorithm stack for use in video analytics. US Patents US10628664B2 and US20200167555A1 (Canada, Israel, EPO).

 

If you are curious about our sign language work, visit Project KinTrans, Hands Can Talk.

Sign language has been used across technical research domains to develop and enhance machine-learning body movement models. Through a Microsoft-sponsored project, linedanceAI owns a dataset of over 600,000 unique sign movements, one of the largest annotated 3D sign language datasets for body movement analysis.

 

Microsoft, linedanceAI, and the financial services industry are also working together in the 4th Annual #HackAutism on April 29-30, 2021. linedanceAI technology will be used to find patterns in stereotypic behaviors characteristic of autism. Developers from across financial services will work to build a clinical application for diagnosis data and treatment.

 

Get in Touch


  • Twitter
  • LinkedIn