AI technologies are enabling the development of not only active tools that provide decision support but also interactive tools that seek human input and feedback. Because interactive tools facilitate human-AI interaction, their design must be informed by human-centric requirements, that is, the needs of the users of such tools. In the context of engineering design, there is a gap in our understanding of how to design intelligent tools that facilitate human-AI interaction. To fill this gap, this study asks the following research question: What are the human-centric design requirements for AI agents that enable human-AI interaction in engineering design contexts? To answer this question, we conducted an interview study with faculty members in engineering design. The faculty predominantly identified engineers, designers, and engineering design students as the stakeholders who would directly benefit from human-AI interaction. For these stakeholders, we identify several human-centric design requirements and challenges in designing AI tools that facilitate human-AI interaction in engineering design. We find that the requirements focus on the need to understand the stakeholders’ cognition and the engineering design contexts. The results of our study point to the need for a theory of mind in AI agents to enable them to infer stakeholder preferences while engaging in engineering design activities.