More than 90% of Deaf and Hard of Hearing (DHH) infants in the US are born to hearing parents. These infants are at severe risk of language deprivation, which can have lifelong impacts on linguistic, cognitive, and socio-emotional development. In this project, we design and develop AR/VR and AI-mediated communication technologies that aim to support context-aware, non-intrusive, and culturally relevant parent-child interaction and collaborative learning of American Sign Language (ASL). The proposed prototype enables empirical studies that collect in-depth design critiques and usability evaluations from domain experts, novice ASL learners, and hearing parents of DHH children.