Abstract

As artificial intelligence (AI) assistance tools become more ubiquitous in engineering design, it becomes increasingly necessary to understand the influence of AI assistance on the design process and design effectiveness. Previous work has shown the advantages of incorporating AI design agents to assist human designers. However, the influence of AI assistance on the behavior of designers during the design process is still unknown. This study examines the differences in participants’ design process and effectiveness with and without AI assistance during a complex drone design task using the HyForm design research platform. Data collected from this study are analyzed to assess the design process and effectiveness using quantitative methods, such as hidden Markov models and network analysis. The results indicate that AI assistance is most beneficial when addressing moderately complex objectives but exhibits a reduced advantage in addressing highly complex objectives. During the design process, the individual designers working with AI assistance employ a relatively explorative search strategy, while the individual designers working without AI assistance devote more effort to parameter design.

Introduction

Engineers regularly face complex design tasks that could be completed more rapidly or effectively with artificial intelligence (AI) assistance. When designing complex engineering systems, the task itself requires managing coupled design parameters and multiple interrelated factors, making the design process difficult and stressful [1,2]. To overcome such challenges, AI tools have been implemented to support human designers as they solve complex design problems. AI assistance allows human designers to work faster with increased effectiveness and efficiency, thereby improving a company's competitiveness in today's fast-evolving market. For example, designers have used AI tools to design products and explore the solution space more rapidly [3], and different AI approaches have been used to support different stages of the engineering design process, including concept generation [4], concept evaluation [5], prototyping [6], and manufacturing [7]. Moreover, research spanning 1500 companies found that overall performance improved significantly when humans and AI worked together [8,9]. However, Zhang et al. [10] highlighted the problems in human-AI collaboration by reporting that AI can hinder the performance of teams, especially high-performing ones.

Multiple studies have investigated the implementation of AI to assist engineers with specific design activities, including decision-making, optimization, and computational tasks [11,12]. However, to achieve efficient human-AI collaboration, there must be a better understanding of how AI can enhance human performance and design effectiveness. Amabile [13] argued that researchers should study the impact of AI and computer-assisted human intelligence on the design outcomes of humans, organizations, and society. As much of the previous literature has focused on developing and implementing AI assistance for product and service development, little is known about the effect of AI assistance on design process effectiveness. A sufficient understanding of how AI assistance influences designers and the design process during complex design problems will aid the development and implementation of better AI-based design tools in the future. These AI design tools will not only improve the outcomes of the design process but may also improve designers’ well-being and experiences during the design process.

The following section of the paper presents the relevant literature, followed by the research questions (RQs) for the current study. Next, a description of the research platform and methods used for data analysis is presented. Then, the results section is followed by a discussion section, and the paper ends with an examination of the limitations of this study and future work.

Relevant Literature

AI has been used in various ways to improve new product development [11]. In this era of global competition, where there is a constant need for high-quality and appropriately priced products, AI tools can quickly guide designers as they respond to evolving consumer demands. Previous literature investigates AI tools ranging from a knowledge-based tool for ethical engineering design [14] to an AI-based tool to help designers build safer buildings [15]. Improved AI-based tools for design can be used in industrial practice to improve design outcomes [11]. Furthermore, AI tools are facilitating the manufacturing process by developing new models and forms [16]. In architecture, AI expert systems assist designers in identifying potential failures in design specifications and solutions [17]. According to Boden [18], AI can be used to create new ideas by (1) producing novel combinations of familiar ideas, (2) exploring the potential of conceptual spaces, and (3) making transformations that enable the generation of previously impossible ideas.

AI tools can also provide designers with decision-support to select suitable design alternatives [19] and produce solutions by learning from users’ needs [20]. Designers view their collaboration with AI as a potential way to add value to their design practice [21]. This is supported by research that has shown that co-creation practices between humans and AI may result in creative outcomes [22] and affect humans’ social, functional, and behavioral outcomes [23]. However, little can be said about AI’s impact on designers’ cognition during the design process.

Recently, there has been increasing awareness concerning cognitive health as it impacts individuals' ability to think, learn, and remember [24]. One indicator of cognitive health during complex engineering design tasks is participants' cognitive load [25]. Cognitive load affects working memory, making it an indicator of the task's complexity and the participant's ability to complete the task [25]. A high level of cognitive load may result in human errors [26]. Moreover, cognitive load has also been linked to mental stress [27–30]. Improved understanding of how to ease cognitive load could maximize human performance and well-being, ultimately reducing errors.

Although AI has advanced to be an effective tool for task-specialized problem-solving, AI by itself cannot solve complex problems requiring general intelligence or abstraction, such as creativity and intuition [31]. By combining the complementary strengths of AI tools and humans, they can together overcome such limitations [32]. While some research has already shown that combining humans with AI assistance improves performance [8,9,33,34], this work will further investigate the influence of AI assistance on design effectiveness and the design process.

Research Questions.

This study examines how the assistance of an AI agent for drone design impacts the design process and outcomes during an individual design activity (as seen in Fig. 1). To accomplish this, the following RQs are addressed:

Fig. 1: Overview of the current work

RQ1: How does AI assistance impact drone design effectiveness across problems of varying complexity?

RQ2: How does AI assistance affect the drone design process of individual designers?

Some of the metrics to evaluate the effectiveness of engineering design outcomes are novelty, exploration (variety), quality (value), and quantity [35,36]. Design effectiveness can be captured using the metrics proposed by Shah et al. [35]. According to Shah et al. [35], novelty measures the unusual nature of an idea when compared to other ideas. Exploration (variety) quantifies the explored solution space during ideation. Quality is assessed in terms of usefulness, indicating a solution’s ability to meet the design requirements. Quantity is the total number of ideas generated. Similarly, in this study, the design effectiveness is evaluated in terms of drone design quantity, quality, exploration, and novelty.

This paper also explores the drone design process in terms of designers’ efforts and search strategies to explore the design space and designers’ cognitive workload during the complex design task. In this work, design effort refers to the number of actions and the amount of time required for a human designer to complete an objective. Some factors that cause a variation in the effort for engineering design tasks include product complexity, technical difficulty, design experience and skill, team size and structure, methods of communication, and use of new technology [37,38]. Among those factors, product complexity is considered to be the dominant factor in determining design effort [38]. In the current work, each designer was randomly assigned to either have or not have access to AI assistance for drone design. Since designers only experienced one of the two experimental conditions, the RQs will be answered by drawing only between-group comparisons.

Data and Methods

In this study, participants are challenged to design drones that meet specific objectives for range, velocity, payload, and cost using an online research platform called HyForm [1]. Timestamped design actions, with the corresponding evolution of design solutions, are recorded. Self-report data are collected using mid- and post-task questionnaires.

Participants.

This institutional review board-approved study recruited undergraduate and graduate students from 18 universities in the United States. Seventy-one students participated in this study. The survey data from four of these participants were removed from analysis due to technical difficulties and participant non-compliance. Of the 67 participants who were included for survey data analysis, 47 identified as male and 19 identified as female (one participant chose not to disclose their gender identity). Participants ranged from 18 to 36 years of age (one participant chose not to disclose their age), with a median age of 22 years. When asked how they racially/ethnically identified, 33 identified as white, 27 identified as Asian, two identified as black or African American, two identified as multiracial, and three chose not to answer.

Most participants had limited engineering design experience: six reported having never done design, 22 considered themselves beginners, 33 reported intermediate experience, five were experienced, and one was an expert. Participants reported similar experience with computer-aided design (never = 5, beginner = 13, intermediate = 32, experienced = 16, and expert = 1). When asked about their experience with computer-based simulations, nine reported having never used them, 29 had used them one to two times, 12 had used them three to five times, and 17 had used them more than five times. Additionally, an overwhelming majority of participants reported that they were not professionals in the areas of building (N = 62) and operating (N = 57) drones. All participants completed two consecutive 20-min design sessions and were compensated with a $20 USD e-card for completing the study.

HyForm Experimental Research Platform.

In this experiment, participants were asked to design drones that meet specific design requirements and fulfill the design objectives using HyForm, an online collaborative design environment that partners AI agents and humans [1]. Although created for multidisciplinary, multi-user collaborative design experimentation, HyForm is used in this work to study individual designers focusing only on drone design. As seen in Fig. 2(a), HyForm allows users to assemble and evaluate drones. Drones can be created from four types of components, each with a variety of size options: (1) a battery, available in 65 sizes; (2) a clockwise motor and rotor pair and (3) a counterclockwise motor and rotor pair, each available in 50 sizes; and (4) an airfoil, available in 100 sizes. The many possible drone configurations with distinct component sizes enable a large design space.

Fig. 2: HyForm drone design interface: (a) basic drone design configuration and (b) drone designs returned by the AI design agent

At the beginning of the study, all participants start with the same basic drone design, as shown in Fig. 2(a). Starting from the basic drone, participants can explore the design space by adding or removing components with connections and changing component sizes. After building a drone, participants can evaluate its performance in terms of range, velocity, cost, and payload. HyForm records all the drone design actions, drone configurations, and performance metrics. Additionally, the platform has a chat tool that allows participants to communicate with an experimenter when they encounter problems during the study.

Additionally, HyForm incorporates an AI design agent, which aids designers in creating drones. This design agent can recommend multiple drone designs by exploring the neighborhood of the current valid drone design, each optimizing one performance metric (range, velocity, cost, or payload), as seen in Fig. 2(b). The recommended drones are drawn from an extensive preestablished drone database, produced by a generative design algorithm that employs a character-based recurrent neural network (Char-RNN) and represents each drone design using a string grammar [1,39,40]. The string grammar defines all drone features, such as the configuration of the two-dimensional layout, component types, and component sizes. In each iteration of the generation process, the Char-RNN is trained on a set of valid drone designs, which is updated iteratively by replacing the lower-performance designs in the training dataset with newly generated higher-performance designs from the current iteration. On this basis, the designs returned by the AI agent are selected from the Pareto front of the generated drone database to maximize range, payload, and velocity and minimize cost. The AI design agent is therefore trained to recommend high-performance designs. By manipulating participants' access to the AI agent, individual designers work either with AI assistance (i.e., AI-assisted designers) or without AI assistance (i.e., solo designers).
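The selection step can be made concrete with a short sketch. The code below is illustrative only, not HyForm's implementation; the array layout and the `pareto_front` helper are assumptions, showing one standard way to filter a generated database down to the Pareto front that maximizes range, payload, and velocity while minimizing cost.

```python
import numpy as np

def pareto_front(metrics: np.ndarray) -> np.ndarray:
    """Boolean mask of non-dominated rows; larger is better in every column."""
    n = metrics.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        # Row j dominates row i if j >= i everywhere and j > i somewhere.
        better_eq = np.all(metrics >= metrics[i], axis=1)
        strictly = np.any(metrics > metrics[i], axis=1)
        keep[i] = not np.any(better_eq & strictly)
    return keep

# Hypothetical usage: perf is an (N, 4) array of [range, payload, velocity, cost]
# for N generated drones; cost is negated so that every column is maximized.
# mask = pareto_front(np.column_stack([perf[:, :3], -perf[:, 3]]))
# recommendations = [drone for drone, m in zip(drones, mask) if m]
```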

Design Problem and Procedure.

Participants were randomly assigned as AI-assisted (N = 38) or solo (N = 33) designers. At the beginning of the study, participants received documents describing how to use HyForm to design drones, a problem brief explaining the experimental task, and a document with the drone design objectives they were required to meet. The five drone development objectives were tailored by the researchers to be challenging to finish in the allotted time (Table 1). Additionally, each subsequent objective was designed to be more complex than the previous one, and participants were given instructions that the objectives needed to be completed in sequential order.

Table 1: Drone design objective requirements

| Objective | Sub-objective | Range (mi) | Payload (lbs) | Cost ($) | Velocity (mph) |
| --- | --- | --- | --- | --- | --- |
| 1 | 1a | ≥20 | | | |
| 1 | 1b | ≥50 | | | |
| 1 | 1c | | ≥10 | | |
| 1 | 1d | | ≥20 | | |
| 2 | 2a | ≥15 | ≥20 | | ≥5 |
| 2 | 2b | ≥20 | ≥15 | | ≥5 |
| 2 | 2c | ≥30 | ≥10 | | ≥5 |
| 2 | 2d | ≥40 | ≥5 | | ≥5 |
| 3 | 3a | ≥18 | ≥25 | ≤5k | |
| 3 | 3b | ≥35 | | ≤5k | ≥8 |
| 3 | 3c | ≥20 | | ≤5k | ≥10 |
| 4 | 4a | ≥18 | ≥15 | ≤4k | ≥10 |
| 4 | 4b | ≥20 | ≥8 | ≤4k | ≥8 |
| 4 | 4c | ≥30 | ≥10 | ≤4k | ≥6 |
| 4 | 4d | ≥40 | ≥5 | ≤4k | ≥6 |
| 5 | 5a | ≥18 | ≥15 | ≤3.5k | ≥8 |
| 5 | 5b | ≥20 | ≥10 | ≤3.5k | ≥15 |
| 5 | 5c | ≥30 | ≥20 | ≤3.5k | ≥6 |

After reading the problem brief and platform instructions, participants were given an initial 20-min session to attempt as many drone development objectives as possible. After the first session was complete, participants were given a break and were asked to complete a mid-task questionnaire. Then, participants were instructed to continue finishing the drone development objectives. The second session was again 20 min in duration and was followed by a post-task questionnaire to complete the experiment. Among all the participants, only one AI-assisted designer did not design and submit drones as required by the instructions. Thus, the data of 37 AI-assisted designers and 33 solo designers were analyzed.

The mid-task questionnaire included demographic questions, questions about relevant prior experience, a question querying the number of objectives completed, and an expanded version [29,30] of the NASA Raw Task Load Index (NASA-RTLX) [41,42]. The NASA-RTLX was used to evaluate each participant's perceived mental workload; unlike the traditional NASA-TLX, it does not require participants to weight the subscales. The post-task questionnaire contained the same question querying the objectives completed and the expanded version of the NASA-RTLX. Demographic questions included age, gender, and race/ethnicity. Prior experience questions covered engineering design, computer-aided design, computer-based simulation, operating drones, and building drones. Details about the experimental design, procedure, and corresponding data collection and analysis for this study can be found in Ref. [43].

Measures.

In addition to the questionnaires, detailed design action and outcome data were collected directly through HyForm. This data log enables various data-driven analyses of design effectiveness and the design process to answer the proposed RQs. Specifically, design effectiveness is evaluated through the number of drones submitted and the drone design exploration, novelty, and quality scores. The design process investigation covers design effort, search strategy, and mental workload.

Exploration and Novelty.

Drone designs are evaluated through exploration and novelty scores. These metrics reflect how a specific drone differs from the basic drone or from all other drones. The calculation of both exploration and novelty is based on the similarity between a pair of drones. In HyForm, the components of a drone are arranged on a square grid of nodes. Design similarity is measured according to a vector representation of drones (Fig. 3(a)), which is constructed in four steps. First, each drone is represented by two n × n matrices, one indicating the component type at each grid node and the other indicating the corresponding normalized component size (Eq. (1)):
$$\text{normalized component size} = \frac{\text{component size} - \text{component size}_{\min}}{\text{component size}_{\max} - \text{component size}_{\min}} \tag{1}$$
Fig. 3: Calculation of exploration and novelty: (a) vector representation of drone designs, (b) possible orientation of a drone, and (c) drone examples
Second, the two matrices are converted into two vectors of length n² by traversing them column-wise. Third, the two vectors are combined into a single vector of length 3n² by expanding each element of the component size vector into three consecutive elements, one per component category. Within each triplet, the corresponding value in the component type vector determines which position holds the normalized component size, and the values at all other positions are 0. At this point, each drone is represented as a vector of length 3n². Fourth, cosine similarity, the cosine of the angle between the two vectors representing two drones, is calculated for pairwise drone similarity. Since the same drone can be arranged in different orientations, resulting in different vectors, the drones are reoriented to maximize the cosine value between a pair of drone vectors (e.g., V_i and V_j) when calculating similarity. As a valid drone (V_i) exhibits either point-symmetric or axisymmetric structures,3 rotating the original drone by 90 deg (V_{i−90}), 180 deg (V_{i−180}), and 270 deg (V_{i−270}) yields all possible arrangements of the same drone, as shown in Fig. 3(b). The unified similarity is the maximum over all the possible values (Eq. (2)):
$$\text{similarity}_{i,j} = \max\left(\cos(V_i, V_j),\; \cos(V_{i-90}, V_j),\; \cos(V_{i-180}, V_j),\; \cos(V_{i-270}, V_j)\right) \tag{2}$$
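To make the four-step construction concrete, the sketch below implements Eqs. (1) and (2) under stated assumptions: each drone is held as an n × n `type_grid` (0 for an empty node, 1–3 for the three component categories) and an n × n `size_grid` whose entries are already normalized per Eq. (1). The function and variable names are illustrative and not taken from HyForm.

```python
import numpy as np

def drone_vector(type_grid: np.ndarray, size_grid: np.ndarray) -> np.ndarray:
    """Flatten an n-by-n layout into the 3n^2 vector described above."""
    n = type_grid.shape[0]
    vec = np.zeros(3 * n * n)
    flat_type = type_grid.flatten(order="F")   # column-wise traversal (step 2)
    flat_size = size_grid.flatten(order="F")
    for k, (t, s) in enumerate(zip(flat_type, flat_size)):
        if t > 0:                              # place the size at the slot of
            vec[3 * k + (t - 1)] = s           # the node's category (step 3)
    return vec

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def similarity(type_i, size_i, type_j, size_j) -> float:
    """Eq. (2): maximal cosine over the four 90-deg rotations of drone i."""
    vj = drone_vector(type_j, size_j)
    return max(
        cosine(drone_vector(np.rot90(type_i, r), np.rot90(size_i, r)), vj)
        for r in range(4)
    )
```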
Figure 3(c) shows a few example drones, including the basic drone, and their corresponding vector representations. According to the drone configurations and vectors, the ranking of the similarity scores between the example drones and the basic drone is drone 1 > drone 2 > drone 3. Similarly, drone 1 is more similar to drone 2 than to drone 3. On this basis, the design exploration of a drone is calculated as one minus the cosine similarity between the drone and the basic drone (Eq. (3)):
$$\text{exploration}_i = 1 - \text{similarity}_{i,\text{basic}} \tag{3}$$
The novelty of a drone is calculated as one minus the average cosine similarity between this drone and all the other drones designed by the participants (Eq. (4)):
$$\text{novelty}_i = 1 - \frac{\sum_{j \neq i}^{N} \text{similarity}_{i,j}}{N - 1} \tag{4}$$
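Given a precomputed pairwise similarity matrix, Eqs. (3) and (4) reduce to a few lines; the following is an illustrative sketch rather than the study's code.

```python
import numpy as np

def exploration_scores(sim_to_basic: np.ndarray) -> np.ndarray:
    """Eq. (3): sim_to_basic[i] is drone i's similarity to the basic drone."""
    return 1.0 - sim_to_basic

def novelty_scores(S: np.ndarray) -> np.ndarray:
    """Eq. (4): S is the N-by-N pairwise similarity matrix with S[i, i] = 1."""
    n = S.shape[0]
    # Average similarity of each drone to all *other* drones, then invert.
    return 1.0 - (S.sum(axis=1) - np.diag(S)) / (n - 1)
```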

Drone Quality.

The quality of a drone design is calculated according to its performance metrics. In HyForm, a drone's performance is evaluated using four metrics: cost, range, payload, and velocity. In general, drone designers aim to design drones that exhibit high range, payload, and velocity at a cost as low as possible. A utility function is built accordingly to calculate the overall drone quality based on these four metrics. Drone quality is calculated through Eq. (5) and then normalized by Eq. (6):
$$\text{quality}_0 = \frac{\text{range} \times \text{velocity} \times \text{payload}}{\text{cost}} \tag{5}$$

$$\text{quality} = \frac{\text{quality}_0 - \min(\text{quality}_0)}{\max(\text{quality}_0) - \min(\text{quality}_0)} \tag{6}$$
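As a sketch, assuming the four metrics arrive as NumPy arrays over all submitted drones (names are illustrative):

```python
import numpy as np

def quality_scores(rng: np.ndarray, payload: np.ndarray,
                   velocity: np.ndarray, cost: np.ndarray) -> np.ndarray:
    """Eqs. (5) and (6): utility per drone, then min-max normalization."""
    q0 = rng * velocity * payload / cost               # Eq. (5)
    return (q0 - q0.min()) / (q0.max() - q0.min())     # Eq. (6)
```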

Hidden Markov Model.

A hidden Markov model (HMM) is a statistical model that captures hidden temporal patterns from sequential observations (here, design actions) [44–46] and is particularly useful for assessing design behavior [44,47–50]. An HMM models a system as a Markov process, with unknown parameters, transitioning between a finite number of discrete states hidden from the observer. The training process of an HMM determines the hidden parameters, the transition matrix and the emission matrix, from the observable data. Specifically, the transition matrix has a size of [m × m], where m is the number of hidden states, and contains the probability of transitioning from the current state to a future state. The emission matrix has a size of [m × n], where n is the number of unique actions, and contains the probability of an action being emitted from a given state. These relations are summarized in Fig. 4. Through the training process, the hidden states come to represent the underlying cognitive or procedural states that the participants transition through during the experiment [44].

Fig. 4: Hidden states and observed actions in an HMM

An HMM is employed to investigate the influence of AI assistance on designers' aggregate design process using the design action data. Each designer's design process is treated as a sequence of design actions, and the whole dataset consists of 70 samples. The Baum–Welch algorithm [45] is employed to train the HMM by maximizing the likelihood of the observations. Since the ideal number of hidden states for modeling the aggregate problem-solving process is unknown, several models with m varying from 1 to 11 (the number of unique actions) are trained and compared; higher values of m are not considered, in order to maintain the independence of the emission probabilities of the states. For each value of m, models are trained 70 times, each time on 69 samples, leaving one sample out as the test sample to increase the generalizability of the results. The average test log-likelihood (an indicator of the model's ability to describe the held-out sample) is calculated for each value of m, and the best model is identified by selecting the value of m with the highest average test log-likelihood. The captured hidden states are interpreted as categories of design activities in this study.
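The leave-one-out selection loop can be sketched as follows, assuming the hmmlearn library, whose `CategoricalHMM` fits discrete-observation HMMs via Baum–Welch (older releases expose the equivalent model as `MultinomialHMM`); the data layout and names are assumptions for illustration, not the study's code.

```python
import numpy as np
from hmmlearn.hmm import CategoricalHMM  # Baum-Welch runs inside .fit()

def mean_test_loglik(sequences, m, n_symbols=11, seed=0):
    """Leave-one-out test log-likelihood of an m-state HMM.

    `sequences` is a list of 1D integer arrays, one per designer, each entry
    coding one of the 11 unique design actions.
    """
    scores = []
    for held_out in range(len(sequences)):
        train = [s for k, s in enumerate(sequences) if k != held_out]
        X = np.concatenate(train).reshape(-1, 1)
        lengths = [len(s) for s in train]
        model = CategoricalHMM(n_components=m, n_features=n_symbols,
                               n_iter=100, random_state=seed)
        model.fit(X, lengths)
        scores.append(model.score(sequences[held_out].reshape(-1, 1)))
    return float(np.mean(scores))

# Model selection over the candidate state counts:
# best_m = max(range(1, 12), key=lambda m: mean_test_loglik(seqs, m))
```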

Results

All designers submitted a total of 585 drones, each aiming to fulfill a specific objective. Since the same drone may meet multiple objectives and be submitted multiple times, the submitted set contains 296 unique drones. The submitted drone designs are categorized into four groups according to the objective categories they meet (no submitted drone meets objective 5 in Table 1). Each unique drone is counted only once, in the highest objective category it reaches. The drone designs in a specific objective category satisfy at least one sub-objective in that category; while they may also satisfy prior objectives, they do not meet any subsequent objectives. In Fig. 5, the drone similarity network provides an overview of the drone space explored by the participants during the study.

Fig. 5: Solution space defined by the submitted drones meeting objectives 1–4. The highest objective met by a drone design is informed by the node color intensity, and the frequency of a drone being submitted is indicated by the node size. The distance between a pair of nodes indicates the pairwise dissimilarity between the corresponding drones. Links between nodes are removed for clarity. The evolution path of the two representative designers and the representative drone designs are also shown.

In the network, each node represents a unique drone design submitted by the designers (Fig. 5). The node color intensity indicates the highest objective met by the drone design, and the node size indicates how many times the drone design was submitted. The distance between each pair of drones indicates the dissimilarity between them; a shorter distance indicates higher similarity. The entire layout of the network is determined by the pairwise similarities between the corresponding drones. The links of the network are not shown for clarity. A few drone examples from the design space are presented. The highlighted region in the network labeled No Foil covers all drone designs without foils (including the basic drone) and splits the space into two predominant areas. Area 1 consists of drones that are less similar to the basic drone, with a larger number of foils, and was mainly generated by the AI-assisted designers. In comparison, area 2 comprises drones that are more similar to the basic drone, with a smaller number of foils, and was primarily developed by the solo designers. The evolution paths of the only two designers (one AI-assisted and one solo) who fulfilled all of objectives 1–4 are highlighted in the network.
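The paper does not name the layout algorithm behind Fig. 5; one plausible stand-in for "distance tracks dissimilarity" is metric multidimensional scaling on the dissimilarity matrix, sketched below with scikit-learn (an assumption for illustration, not the authors' method).

```python
import numpy as np
from sklearn.manifold import MDS

def layout_from_similarity(S: np.ndarray, seed: int = 0) -> np.ndarray:
    """2D node positions whose distances approximate 1 - similarity."""
    D = 1.0 - S  # pairwise dissimilarity matrix
    mds = MDS(n_components=2, dissimilarity="precomputed", random_state=seed)
    return mds.fit_transform(D)  # (N, 2) coordinates, one row per drone
```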

The following subsections explore the solution space depicted by the network quantitatively to assess the designers’ design effectiveness and design process.

Design Effectiveness.

First, design effectiveness is measured via the quantity, exploration, novelty, and quality of the submitted drone designs.

Quantity.

On average, the AI-assisted designers tend to submit a greater number of drones (t = 1.969, p = 0.053, d = 0.469),4 as shown in Fig. 6(a). Considering that one drone design can be submitted for multiple sub-objectives and that different drone designs can share the same drone configuration (i.e., have the same component type at each position) with different component sizes, the numbers of unique drone designs and drone configurations are further compared. As seen in Figs. 6(b) and 6(c), the AI-assisted designers submit significantly more unique drone designs (t = 2.295, p = 0.025, d = 0.548) and unique drone configurations (t = 4.069, p < 0.001, d = 0.979). These results indicate that the AI agent assists designers in diversifying their solutions to fulfill the design objectives.

Fig. 6: Average number of submissions: (a) submitted drones, (b) unique drone designs, and (c) unique drone design configurations

As shown in Fig. 7, the AI-assisted designers submit more drones that meet sub-objectives in objectives 2–4 compared to the solo designers, with a more prominent effect on the moderately complex objectives 2 and 3 (objective 2: Z = 1.901, p = 0.046, r = 0.317; objective 3: Z = 3.551, p < 0.001, r = 0.609; and objective 4: Z = 1.049, p = 0.036, r = 0.177). Conversely, the solo designers submit more drones that only meet objective 1 sub-objectives (Z = 2.143, p = 0.026, r = 0.373). However, the AI-assisted and solo designers accomplish, respectively, 3.8 and 3.54 of the 4 sub-objectives of objective 1 on average. The smaller number of unique AI-assisted drone designs for objective 1 is therefore likely because the AI-assisted designers act more proactively, designing drones that aim at less complex sub-objectives (e.g., objective 1) but happen to meet the requirements of more complex sub-objectives (e.g., objective 2 or 3). In contrast, the solo designers tend to create drone designs that do not exceed the given sub-objective requirements.

Fig. 7: The average number of unique drones meeting the given objectives

Exploration and Novelty.

Exploration and novelty, as previously defined (Eqs. (3) and (4)), measure the differences between a submitted drone and, respectively, the basic drone (i.e., the common starting point of all designers) and all the other submitted drones.

As seen in Figs. 8(a) and 8(b), the AI-assisted drone designs present significantly higher exploration (objective 1: Z = 2.973, p < 0.001, r = 0.308; objective 2: Z = 8.446, p < 0.001, r = 0.765; objective 3: Z = 5.087, p < 0.001, r = 0.562) and novelty (objective 2: Z = 3.057, p < 0.001, r = 0.520; objective 3: t = 2.114, p = 0.037, d = 0.429) scores than the solo drone designs for objectives 1–3 (except the novelty score for objective 1), with more prominent influences seen for objectives 2 and 3.

Fig. 8: Dissimilarity metrics of drone designs: (a) exploration and (b) novelty

However, the effect of AI assistance is not significant for the highly complex objective 4. Moreover, the evolution from objective 1 to objective 4 shows that, under the AI-assisted condition, the drone exploration and novelty scores first rise substantially and then converge towards a moderate value as the designers approach the more complex objectives. Under the solo condition, the drone exploration and novelty scores rise mildly and incrementally approach the scores of the AI-assisted condition for the highly complex objective 4. These results are consistent with the distribution of the drones under each objective category in the solution space (Fig. 5).

Quality.

Drone quality results are presented in Fig. 9(a). AI assistance significantly improves the quality scores of drones satisfying objectives 1, 2, and 3 (objective 1: Z = 5.012, p < 0.001, r = 0.520; objective 2: Z = 6.093, p < 0.001, r = 0.552; objective 3: t = 2.997, p = 0.003, d = 0.660; and objective 4: t = 1.942, p = 0.061, d = 0.684). However, there is no significant difference in drone quality between the experimental conditions for objective 4. Furthermore, an apparent increase in drone quality is seen as the solo designers progress from objective 1 to higher objectives, indicating that skill/learning is accumulated incrementally. In contrast, AI assistance enables the designers to make more significant improvements in drone quality as they approach the moderately complex objectives and to maintain that performance for the following objectives. By the time the designers reach the highly complex objective 4, the accumulated skill/learning weakens the advantage of using AI assistance.

Fig. 9: Drone design quality: (a) quality and (b) correlations between drone quality and foil utilization

The evolution trends over objectives 1–4 shown in Figs. 8(a) and 9(a) imply that the solo designers exhibit an incremental learning process in general. They tend to extend their exploration scope and improve drone quality only when driven by the increasingly complex design objectives. In contrast, the AI-assisted designers expand their exploration and improve drone quality proactively during the moderately complex objectives. In summary, AI assistance is more beneficial for moderately complex objectives and has a reduced impact on highly complex objectives.

Moreover, when comparing the example drones shown in Fig. 5, the drones designed by the AI-assisted designers (primarily located in area 1) differ from the drones designed by the solo designers (pervading the No Foil region and area 2). Specifically, the drones designed by the AI-assisted designers are more likely to incorporate a higher number of foils—thus, AI assistance facilitates foil utilization. Figure 9(b) suggests that drone quality is positively correlated with the number of foils a drone incorporates, both for the drones designed by the AI-assisted designers and for those designed by the solo designers. These results, in line with the literature on drone design [51], help to explain why and how AI assistance improves drone quality.

Design Process.

The log data from HyForm and the questionnaire data are analyzed to understand the design process with and without AI assistance.

Design Effort.

The effort invested by the designers during the design process is measured by the number of actions and the average time taken to design a drone in each objective category, based on what each designer achieved. The results are presented in Figs. 10(a) and 10(b). The most evident differences caused by AI assistance are seen for objectives 1 and 2. AI assistance enables higher action efficiency at the beginning (i.e., objective 1) for the AI-assisted designers (Z = 2.532, p = 0.033, r = 0.283), while the solo designers achieve higher time efficiency than the AI-assisted designers for objective 2 (Z = 2.190, p = 0.033, r = 0.255). Although the AI-assisted designers perform fewer actions (Fig. 10(a)), they do not spend significantly less time designing a drone for objective 1 (Fig. 10(b)), indicating that the AI-assisted designers spend more time making design decisions rather than performing actions.

Fig. 10: Design efforts: (a) average action count and (b) average amount of time for designing a drone in each objective category

Moreover, considering the evolution over objectives 1–4 in Figs. 10(a) and 10(b), it can be seen that the solo designers take more design actions to successfully design single drones satisfying objective 1. They then gain higher action and time efficiencies when designing drones for objective 2. In contrast, the AI-assisted designers do not exhibit improved action and time efficiencies in this transition. The solution space evolution visualized in Fig. 5 explains this: in the transition from objective 1 to objective 2, the solo designers make shorter moves; in comparison, the AI-assisted designers navigate to relatively distant areas with quite different drone configurations, which requires more design effort. Finally, the solo designers reach action and time efficiencies similar to the AI-assisted designers' as the design objectives become more complex. These results further support that the influence of AI assistance decreases as the design objective complexity increases and as participants become more familiar with the design task.

Search Strategy.

In this section, the designers’ search strategy is compared between the AI-assisted and solo design conditions by looking at designers’ behavior and design evolution process. The designers’ action data are analyzed using an HMM [44,45] to assess their behavior during the design process.

The training process of the HMM results in an optimal model with four hidden states, representing four categories of emergent design activities that illustrate the aggregate design process. These states are used to understand how participants complete the drone design tasks. The transition matrix between the detected hidden states and the emission matrix between the hidden states and the design actions are shown in Fig. 11(a). According to the associations between the hidden states and the design actions (Fig. 11(b)), the four states are named state 1: drone configuration design (changes to component type), state 2: evaluation and management of drone designs, state 3: drone parameter design, and state 4: drone configuration design (adding or removing components). The prominence of these states is compared in 10-min periods between the two conditions, as shown in Fig. 11(c). The AI-assisted designers allocate more action resources toward evaluating and managing drones and fewer action resources toward drone parameter design than the solo designers (state 2—period 1: Z = 1.371, p = 0.160, r = 0.228; period 2: Z = 2.473, p = 0.052, r = 0.407; period 3: Z = 3.359, p = 0.013, r = 0.568; period 4: Z = 3.061, p = 0.011, r = 0.510. State 3—period 1: t = 1.371, p = 0.175, d = 0.223; period 2: t = 2.473, p = 0.016, d = 0.378; period 3: t = 3.061, p = 0.003, d = 0.590; period 4: t = 3.359, p = 0.001, d = 0.520). Herein, "parameter design" refers to changing component sizes, while "configuration design" refers to changing the drone configuration by adding or removing components and changing component types. The solo designers focus more on parameter design in their exploration compared to the AI-assisted designers.

Fig. 11: HMM describing the aggregate design process. (a) The transition matrix between the hidden states and the emission matrix between the hidden states and the design actions. (b) The associations between the hidden states and the design actions. Only associations with probabilities higher than 0.005 are shown. (c) Evolution of the composition of designers' activities over time. Each subplot represents the percentage of the actions associated with the corresponding hidden state.

Next, the size of a stepwise move is assessed, defined as the dissimilarity between two consecutive drone designs generated by the same designer. Note that the moves made under the AI-assisted condition are separated into two groups: moves made by the human designer and moves made by the AI agent (examples shown in Fig. 12(a)). As shown in Fig. 12(b), the AI-assisted designers, on average, make stepwise moves of similar size to the solo designers', while the AI agent makes bigger moves than both the AI-assisted designers (Z = 28.352, p < 0.001, r = 0.771) and the solo designers (Z = 32.609, p < 0.001, r = 0.800). These results indicate that AI assistance enables human designers to expand their exploration of the solution space more efficiently. Moreover, these results explain the differences between the two conditions in solution space exploration (Fig. 5) and in the drone exploration and novelty scores (Fig. 8). In summary, with similar action effort (Fig. 11(c)), the AI-assisted designers employ a more explorative search strategy and obtain more unique drone design configurations (Fig. 6(c)). In comparison, the solo designers focus more on local optimization through parameter design (Fig. 11(c)).
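Using the `similarity` function sketched earlier, stepwise move sizes follow directly; the sequence format below is an assumption for illustration.

```python
def stepwise_moves(design_sequence):
    """Move sizes for one designer: the dissimilarity (1 minus the Eq. (2)
    similarity) between each pair of consecutive (type_grid, size_grid)
    designs in chronological order."""
    return [
        1.0 - similarity(*design_sequence[k], *design_sequence[k + 1])
        for k in range(len(design_sequence) - 1)
    ]
```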

Fig. 12: Design search strategy indicated by stepwise moves: (a) examples of stepwise moves and (b) average stepwise move

Mental Workload.

Differences in the mental workload of the AI-assisted and solo designers could explain the differences in their design processes. The mental workload for each participant is determined by averaging their scores across all NASA-RTLX measures. Figure 13 shows the variability in each of the NASA-RTLX dimensions by design condition.

Fig. 13: Mental workload measures

The average mental workload score across the two sessions is used for the AI-assisted designers (M = 55.31, SD = 11.41) and the solo designers (M = 58.87, SD = 12.12). No significant difference in average mental workload is found between the AI-assisted and solo designers (t = −0.892, p = 0.376, d = 0.11). The results do not change when the expanded measures from the workload questionnaire are included in the analysis. This result indicates that the novel use of AI assistance for drone design does not increase designers' mental workload during the task. It contradicts previous research suggesting that AI can reduce participants' mental workload [52,53] but aligns with more recent studies finding that mental workload is not impacted by cognitive assistance during simple tasks [54,55].

Discussion

Individual designers were tasked with completing increasingly complex drone design objectives, either with or without AI assistance, using the design research platform HyForm. The log data from HyForm record the details of every action and are analyzed to understand the influence of AI assistance on design effectiveness and the design process. In terms of design effectiveness, AI assistance enables the AI-assisted designers to achieve more unique drone designs and configurations (Figs. 6(b) and 6(c)). Furthermore, over objectives 1–4, AI assistance has the most prominent effects on the moderately complex objectives 2 and 3 but reduced, i.e., not statistically significant, effects on the highly complex objective 4, as indicated by the comparison of the drone quantity, exploration, novelty, and quality metrics between the two conditions. These findings complement a prior study showing that the same design agent improves team performance in designing and operating a drone fleet [39].

With regard to the design process, the AI-assisted designers start the design process more efficiently, whereas the solo designers expend more action effort per drone (Fig. 10(a)). As the designers become familiar with the solution space during objective 1 and approach objective 2, the AI-assisted designers act proactively to explore new areas (Figs. 5, 8, and 9). In comparison, the solo designers tend to expand their exploration conservatively and incrementally, relying more on local optimization (Figs. 5, 8, and 9). Hence, the large drops in design effort (in terms of action count and amount of time to build a drone) are seen in the transition from objective 1 to objective 2 for the solo designers but not for the AI-assisted designers (Fig. 10). Accordingly, the AI-assisted designers allocate less action effort to parameter design (Fig. 11(c)) and make bigger stepwise moves with the AI assistance (Fig. 12(b)) than the solo designers. These findings indicate their differences in search strategies. By utilizing the drone designs returned by the AI agent, the AI-assisted designers explore new drone configurations more efficiently and confidently, something that remains challenging for the solo designers. Therefore, the AI-assisted designers employ a more explorative search using the AI agent, whereas the solo designers prefer an optimization-driven search through parameter design. However, the paths of the two best-performing designers, one AI-assisted and one solo (Fig. 5), suggest that both properly integrated the two search strategies to complete all of objectives 1–4.

The findings in this paper inform the development of AI design assistance from two perspectives. First, AI design assistance that enables a more efficient and broader exploration of the solution space potentially benefits the design process and outcomes; this provides a direction for conceiving new design AIs. Second, addressing design objectives of varying complexity requires efficient adaptation of the search strategy. The current AI assistance facilitates the adaptation from an incremental, optimization-driven search strategy towards an explorative search strategy, which is most beneficial for moderately complex objectives. However, the AI assistance exhibits a weakened, not statistically significant, impact on the adaptation towards a proper integration of the two search strategies, which is likely required for solving highly complex objectives, such as objective 4. With the limitations of the current AI agent in mind, it is hoped that this study encourages researchers to explore AI characteristics beyond capability, such as adaptability to problem complexity, and to develop advanced AI agents able to overcome similar limitations.

Limitations and Future Work

While the results of this work are promising, this study does have some limitations. The primary limitation was the use of only one type of AI assistance for design, which may limit the generalizability of these results to other types of design tasks. Moreover, limited qualitative data were collected during this study to understand the participants' experiences during the design tasks. When analyzing the data, differences in demographic dimensions and in the participants' mental or physical state prior to the experiment were not considered; these factors might have affected participants' cognitive demand, effort, attitudes toward designing, and behavior. Additionally, while the measures used for design effort and mental workload were appropriate for this online experiment, future work should include additional measurements, including observation and in-person data collection mechanisms.

Future research should be conducted to verify and expand the results of this study. Qualitative analysis and user interviews should be incorporated in future work to draw inferences regarding human designers’ trust in AI design solutions, design effort, and mental workload. As the current work used only one type of AI assistance for design, in the future, a broader selection of AI design assistance capabilities should be explored to investigate and compare the generalizability of these results to other types of design and collaboration tasks. These efforts are essential to the design of effective collaborative AIs. Furthermore, future studies could extend the current work by drawing comparisons between expert and novice designers to highlight the benefits and limitations of AI design assistance.

Conclusion

Through a design experiment asking participants to fulfill increasingly complex design objectives, this study shows that the effect of AI assistance on design effectiveness varies with the complexity of the task. The AI-assisted and solo designers use different search strategies: the former engage in a more explorative search with the AI assistance, while the latter pursue a more optimization-driven search through parameter design.

Footnotes

3. The drones are strictly symmetric in terms of component type and approximately symmetric in terms of component size.

4. The statistics in this paper are obtained using either parametric independent t-tests or non-parametric Mann–Whitney U tests, for distributions that do and do not follow a normal distribution, respectively. For parametric tests, the t-statistic (t), p-value (p), and Cohen's d (d) are reported; for non-parametric tests, the Z-statistic (Z), p-value (p), and effect size (r) are reported.
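A sketch of this test-selection logic in Python (assuming SciPy >= 1.7 defaults; the Shapiro-Wilk normality check and the recovery of |Z| from the two-sided p-value are this sketch's assumptions, not necessarily the authors' exact procedure):

```python
import numpy as np
from scipy import stats

def compare_groups(a, b, alpha=0.05):
    """t-test if both samples pass Shapiro-Wilk normality, else Mann-Whitney U."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    if stats.shapiro(a).pvalue > alpha and stats.shapiro(b).pvalue > alpha:
        t, p = stats.ttest_ind(a, b)
        pooled = np.sqrt(((len(a) - 1) * a.std(ddof=1) ** 2 +
                          (len(b) - 1) * b.std(ddof=1) ** 2)
                         / (len(a) + len(b) - 2))
        d = (a.mean() - b.mean()) / pooled       # Cohen's d
        return "t-test", t, p, d
    u, p = stats.mannwhitneyu(a, b)              # two-sided by default
    z = stats.norm.isf(p / 2)                    # |Z| from the two-sided p
    r = z / np.sqrt(len(a) + len(b))             # effect size r = Z / sqrt(N)
    return "Mann-Whitney U", u, p, r
```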

Acknowledgment

The authors are appreciative of the welcoming response from the design research community to help recruit participants for this study. This material is based upon work supported by the Defense Advanced Research Projects Agency through cooperative agreement N66001-17-1-4064. Any opinions, findings, and conclusions or recommendations expressed in this paper are those of the authors and do not necessarily reflect the views of the sponsors.

Conflict of Interest

There are no conflicts of interest.

Data Availability Statement

The datasets generated and supporting the findings of this article are available from the corresponding author upon reasonable request. The data and information that support the findings of this article are freely available.

References

1. Song, B., Soria Zurita, N. F., Zhang, G., Stump, G., Balon, C., Miller, S. W., Yukish, M., Cagan, J., and McComb, C., 2020, "Toward Hybrid Teams: A Platform to Understand Human–Computer Collaboration During the Design of Complex Engineered Systems," Proceedings of the Design Society: DESIGN Conference, Cavtat, Croatia, Oct. 26–29, Vol. 1, pp. 1551–1560.
2. Soria Zurita, N. F., and Tumer, I. Y., 2017, "A Survey: Towards Understanding Emergent Behavior in Complex Engineered Systems," Proceedings of the ASME Design Engineering Technical Conference, Cleveland, OH, Aug. 6–9, p. V007T06A015.
3. Koch, J., and Paris-Saclay, I., 2017, "Design Implications for Designing With a Collaborative AI," AAAI Spring Symposium Series, Palo Alto, CA, Mar. 27–29, pp. 415–418.
4. Camburn, B., Arlitt, R., Anderson, D., Sanaei, R., Raviselam, S., Jensen, D., and Wood, K. L., 2020, "Computer-Aided Mind Map Generation Via Crowdsourcing and Machine Learning," Res. Eng. Des., 31(4), pp. 383–409.
5. Camburn, B., He, Y., Raviselvam, S., Luo, J., and Wood, K., 2020, "Machine Learning-Based Design Concept Evaluation," ASME J. Mech. Des., 142(3), p. 031113.
6. Dering, M. L., Tucker, C. S., and Kumara, S., 2018, "An Unsupervised Machine Learning Approach to Assessing Designer Performance During Physical Prototyping," ASME J. Comput. Inf. Sci. Eng., 18(1), p. 011002.
7. Williams, G., Meisel, N. A., Simpson, T. W., and McComb, C., 2019, "Design Repository Effectiveness for 3D Convolutional Neural Networks: Application to Additive Manufacturing," ASME J. Mech. Des., 141(11), p. 111701.
8. Daugherty, P. R., and Wilson, H. J., 2018, Human + Machine: Reimagining Work in the Age of AI, Harvard Business Press, Boston, MA.
9. Wilson, H. J., and Daugherty, P. R., 2018, "Collaborative Intelligence: Humans and AI Are Joining Forces," Harvard Bus. Rev., 96(4), pp. 114–123.
10. Zhang, G., Raina, A., Cagan, J., and McComb, C., 2021, "A Cautionary Tale About the Impact of AI on Human Design Teams," Des. Stud., 72, p. 100990.
11. Rao, S. S., Nahm, A., Shi, Z., Deng, X., and Syamil, A., 1999, "Artificial Intelligence and Expert Systems Applications in New Product Development—A Survey," J. Intell. Manuf., 10(3), pp. 231–244.
12. Raina, A., Cagan, J., and McComb, C., 2019, "Transferring Design Strategies From Human to Computer and Across Design Problems," ASME J. Mech. Des., 141(11), p. 114501.
13. Amabile, T. M., 1998, How to Kill Creativity, Harvard Business School Publishing, Boston, MA.
14. Sekiguchi, K., and Hori, K., 2020, "Organic and Dynamic Tool for Use With Knowledge Base of AI Ethics for Promoting Engineers' Practice of Ethical AI Design," AI Soc., 35(1), pp. 51–71.
15. Selin, J., and Rossi, M., 2018, "The Functional Design Method for Buildings (FDM) With Gamification of Information Models and AI Help to Design Safer Buildings," 2018 Federated Conference on Computer Science and Information Systems (FedCSIS), Poznan, Poland, Sept. 9–12, pp. 907–911.
16. Li, B. H., Hou, B. C., Yu, W. T., Lu, X. B., and Yang, C. W., 2017, "Applications of Artificial Intelligence in Intelligent Manufacturing: A Review," Front. Inf. Technol. Electron. Eng., 18(1), pp. 86–96.
17. Fischer, G., and Nakakoji, K., 1992, "Beyond the Macho Approach of Artificial Intelligence: Empower Human Designers—Do Not Replace Them," Knowl. Based Syst., 5(1), pp. 15–30.
18. Boden, M. A., 1998, "Creativity and Artificial Intelligence," Artif. Intell., 103(1–2), pp. 347–356.
19. Chan, F. T. S., Jiang, B., and Tang, N. K. H., 2000, "Development of Intelligent Decision Support Tools to Aid the Design of Flexible Manufacturing Systems," Int. J. Prod. Econ., 65(1), pp. 73–84.
20. Karan, E., and Asadi, S., 2019, "Intelligent Designer: A Computational Approach to Automating Design of Windows in Buildings," Autom. Constr., 102, pp. 160–169.
21. Guzdial, M., Liao, N., Chen, J., Chen, S.-Y., Shah, S., Shah, V., Reno, J., Smith, G., and Riedl, M. O., 2019, "Friend, Collaborator, Student, Manager: How Design of an AI-Driven Game Level Editor Affects Creators," Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow, UK, May 4–9.
22. Feldman, S., "Co-Creation: Human and AI Collaboration in Creative Expression," Electronic Visualisation and the Arts, London, UK, July 11–13, pp. 422–429.
23. Chattaraman, V., Kwon, W. S., Gilbert, J. E., and Ross, K., 2019, "Should AI-Based, Conversational Digital Assistants Employ Social- or Task-Oriented Interaction Style? A Task-Competency and Reciprocity Perspective for Older Adults," Comput. Hum. Behav., 90, pp. 315–330.
24. Morley, J. E., Morris, J. C., Berg-Weger, M., Borson, S., Carpenter, B. D., del Campo, N., Dubois, B., Fargo, K., Fitten, L. J., Flaherty, J. H., Ganguli, M., Grossberg, G. T., Malmstrom, T. K., Petersen, R. D., Rodriguez, C., Saykin, A. J., Scheltens, P., Tangalos, E. G., Verghese, J., Wilcock, G., Winblad, B., Woo, J., and Vellas, B., 2015, "Brain Health: The Importance of Recognizing Cognitive Impairment: An IAGG Consensus Conference," J. Am. Med. Dir. Assoc., 16(9), pp. 731–739.
25. Sweller, J., 2016, Evolutionary Perspectives on Child Development and Education, D. C. Geary and D. B. Berch, eds., Springer International Publishing, Cham, Switzerland, pp. 291–306.
26. Zhou, J., Yu, K., Chen, F., Wang, Y., and Arshad, S. Z., 2018, The Handbook of Multimodal-Multisensor Interfaces: Foundations, User Modeling, and Common Modality Combinations—Volume 2, S. Oviatt, S. Bjorn, P. R. Cohen, D. Sonntag, G. Potamianos, and A. Kruger, eds., Association for Computing Machinery and Morgan & Claypool, New York, pp. 287–329.
27. Dykstra, J., and Paul, C. L., 2018, "Cyber Operations Stress Survey (COSS): Studying Fatigue, Frustration, and Cognitive Workload in Cybersecurity Operations," 11th USENIX Workshop on Cyber Security Experimentation and Test (CSET), Baltimore, MD, Aug. 13.
28. Fallahi, M., Motamedzade, M., Heidarimoghadam, R., Soltanian, A. R., and Miyake, S., 2016, "Effects of Mental Workload on Physiological and Subjective Responses During Traffic Density Monitoring: A Field Study," Appl. Ergon., 52, pp. 95–103.
29. Nolte, H., and McComb, C., 2020, "Identifying Stress Signatures Across the Engineering Design Process: Perceived Stress During Concept Generation, Concept Selection, and Prototyping," Proceedings of the Design Society: DESIGN Conference, Cavtat, Croatia, Oct. 26–29.
30. Nolte, H., and McComb, C., 2021, "The Cognitive Experience of Engineering Design: An Examination of First-Year Student Stress Across Principal Activities of the Engineering Design Process," Des. Sci., 7, p. e3.
31. Lake, B. M., Ullman, T. D., Tenenbaum, J. B., and Gershman, S. J., 2017, "Building Machines That Learn and Think Like People," Behav. Brain Sci., 40, p. E253.
32. Dellermann, D., Ebel, P., Söllner, M., and Leimeister, J. M., 2019, "Hybrid Intelligence," Bus. Inf. Syst. Eng., 61(5), pp. 637–643.
33. VanLehn, K., Burkhardt, H., Cheema, S., Kang, S., Pead, D., Schoenfeld, A., and Wetzel, J., 2019, "Can an Orchestration System Increase Collaborative, Productive Struggle in Teaching-by-Eliciting Classrooms?," Interact. Learn. Environ.
34. Liew, C., 2018, "The Future of Radiology Augmented With Artificial Intelligence: A Strategy for Success," Eur. J. Radiol., 102, pp. 152–156.
35. Shah, J. J., Vargas-Hernandez, N., and Smith, S. M., 2003, "Metrics for Measuring Ideation Effectiveness," Des. Stud., 24(2), pp. 111–134.
36. Nguyen, L., and Shanks, G., 2009, "A Framework for Understanding Creativity in Requirements Engineering," Inf. Softw. Technol., 51(3), pp. 655–662.
37. Bashir, H. A., and Thomson, V., 1999, "Metrics for Design Projects: A Review," Des. Stud., 20(3), pp. 263–277.
38. Bashir, H. A., and Thomson, V., 2001, "Models for Estimating Design Effort and Time," Des. Stud., 22(2), pp. 141–155.
39. Song, B., Gyory, J. T., Soria Zurita, N. F., Stump, G. M., Martin, J. D., Miller, S. W., Balon, C. M., Yukish, M. A., McComb, C., and Cagan, J., 2021, "Decoding the Agility of Human-Artificial Intelligence Hybrid Teams in Complex Problem Solving," Des. Stud.
40. Stump, G. M., Miller, S. W., Yukish, M. A., Simpson, T. W., and Tucker, C., 2019, "Spatial Grammar-Based Recurrent Neural Network for Design Form and Behavior Optimization," ASME J. Mech. Des., 141(12), p. 124501.
41. Hart, S. G., and Staveland, L. E., 1988, "Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research," Adv. Psychol., 52(C), pp. 139–183.
42. Hart, S. G., 2006, "NASA-Task Load Index (NASA-TLX); 20 Years Later," Proceedings of the Human Factors and Ergonomics Society Annual Meeting, Los Angeles, CA, October.
43. Zhang, G., Soria Zurita, N. F., Stump, G., Song, B., Cagan, J., and McComb, C., 2021, "Data on the Design and Operation of Drones by Both Individuals and Teams," Data Br., 36, p. 107008.
44. McComb, C., Cagan, J., and Kotovsky, K., 2017, "Mining Process Heuristics From Designer Action Data Via Hidden Markov Models," ASME J. Mech. Des., 139(11), p. 111412.
45. Baum, L. E., Petrie, T., Soules, G., and Weiss, N., 1970, "A Maximization Technique Occurring in the Statistical Analysis of Probabilistic Functions of Markov Chains," Ann. Math. Stat., 41(1), pp. 164–171.
46. Durbin, R., Eddy, S. R., Krogh, A., and Mitchison, G., 1998, Biological Sequence Analysis: Probabilistic Models of Proteins and Nucleic Acids, Cambridge University Press, Cambridge, UK.
47. Mehta, P., Malviya, M., McComb, C., Manogharan, G., and Berdanier, C. G. P., 2020, "Mining Design Heuristics for Additive Manufacturing Via Eye-Tracking Methods and Hidden Markov Modeling," ASME J. Mech. Des., 142(12), p. 124502.
48. Maier, T., Zurita, N. F. S., Starkey, E., Spillane, D., Menold, J., and McComb, C., 2020, "Analyzing the Characteristics of Cognitive-Assistant-Facilitated Ideation Groups," Proceedings of the ASME Design Engineering Technical Conference, virtual, Aug. 17–19, p. V008T08A046.
49. Goucher-Lambert, K., and McComb, C., 2019, "Using Hidden Markov Models to Uncover Underlying States in Neuroimaging Data for a Design Ideation Task," Proceedings of the International Conference on Engineering Design (ICED), Delft, The Netherlands, Aug. 5–8.
50. Mahan, T., Meisel, N., McComb, C., and Menold, J., 2019, "Pulling at the Digital Thread: Exploring the Tolerance Stack Up Between Automatic Procedures and Expert Strategies in Scan to Print Processes," ASME J. Mech. Des., 141(2), p. 021701.
51. Di Luca, M., Mintchev, S., Su, Y., Shaw, E., and Breuer, K., 2020, "A Bioinspired Separated Flow Wing Provides Turbulence Resilience and Aerodynamic Efficiency for Miniature Drones," Sci. Robot., 5(38), p. 8533.
52. Brachten, F., Brünker, F., Frick, N. R. J., Ross, B., and Stieglitz, S., 2020, "On the Ability of Virtual Agents to Decrease Cognitive Load: An Experimental Study," Inf. Syst. e-Bus. Manage., 18(2), pp. 187–207.
53. de Melo, C. M., Kim, K., Norouzi, N., Bruder, G., and Welch, G., 2020, "Reducing Cognitive Load and Improving Warfighter Problem Solving With Intelligent Virtual Assistants," Front. Psychol., 11(554706), pp. 1–12.
54. Maier, T., Donghia, V., Chen, C., Menold, J., and McComb, C., 2019, "Assessing the Impact of Cognitive Assistants on Mental Workload in Simple Tasks," Proceedings of the ASME Design Engineering Technical Conference, Anaheim, CA, Aug. 18–21, p. V007T06A021.
55. Maier, T., Abdullah, S., McComb, C., and Menold, J., 2021, "A Query Conundrum: The Mental Challenges of Using a Cognitive Assistant," SN Comput. Sci., 2(3), p. 194.