Atom Tracker: Designing a Mobile Augmented Reality Experience to Support Instruction about Cycles and Conservation of Matter in Outdoor Learning Environments

Amy M. Kamarainen, Shari Metcalf, Tina A. Grotzer, Craig Brimhall and Chris Dede

Amy Kamarainen is an ecosystem ecologist and senior research manager at the Harvard Graduate School of Education (HGSE). Her work focuses on the design and evaluation of technologies to support science learning both in and outside of the classroom.

Shari Metcalf is the director of the EcoXPT and EcoMOBILE projects at HGSE. She has been working as a researcher in educational technology for two decades, primarily in middle and high school science.

Tina Grotzer is an Associate Professor of Education at HGSE and a Principal Investigator at Project Zero. She directs the Understandings of Consequence: Causal Cognition in a Complex World Lab (UC Lab). She received a Presidential Early Career Award in 2011. She taught in K-12 schools for 14 years.

Craig Brimhall completed a master's degree at HGSE and is exploring ways to increase educational opportunities for marginalized communities through the design of instruction and learning spaces.

Chris Dede is the Timothy E. Wirth Professor of Learning Technologies at HGSE. His fields of scholarship include emerging technologies, policy, and leadership. He has served on the National Academy of Sciences Committee on Foundations of Educational and Psychological Assessment and the Technical Working Group for the US Department of Education's 2010 National Educational Technology Plan.

Abstract

We describe a mobile augmented reality (AR) experience called Atom Tracker, designed to help middle school students better understand the cycling of matter in ecosystems, with a focus on the concept of conservation of matter and the processes of photosynthesis and respiration. Location-based AR allows students to locate virtual "hotspots," where they interact with multiple representations, including vision-based AR animations of virtual atoms during ecological processes such as photosynthesis and physical LEGO®-based representations of molecules. This design case describes the design rationale, the iterative design process, the context for implementation, and reflections on the successes and limitations of the Atom Tracker AR experience. An augmented reality interface was chosen due to theoretical support for its utility in supporting interaction with multiple representations (both physical and virtual) of atoms and molecules, its ability to condense and expand the temporal and spatial scales associated with ecological processes, and its ability to explicitly situate these representations in real-world contexts that could support learning. Two significant design challenges that we recognized were 1) appropriately leveraging narrative, student engagement, and agency when designing around the topic of atoms and molecules, which are inanimate and invisible; and 2) designing for engagement with both virtual and physical resources available during the experience.

EcoMUVE as a Precursor

The idea for Atom Tracker arose during the ecological multi-user virtual environment (EcoMUVE) project. EcoMUVE is a 10-day middle school science curriculum that supports learning about causal patterns in ecosystems. It consists of two ecosystem modules, Pond and Forest, each centered on an immersive virtual environment that represents a complex causal scenario.
In the Pond module, students use an avatar to explore a pond and its surroundings (Figure 1) over eight virtual summer days and discover that, late in the summer, all of the large fish have died. By exploring, observing, and collecting data, students unravel the complicated connections among distributed sources of fertilizer, an ensuing algal bloom, and eventual life-threatening oxygen depletion following excessive decomposition. The design team conceived of a virtual Atom Tracker tool to reveal otherwise invisible molecular processes (like decomposition) that play a role in the fish kill scenario.

Figure 1. Screenshot of an avatar in the EcoMUVE Pond immersive environment.

We recognized that Atom Tracker could emphasize the connection between small-scale processes and larger-scale phenomena – an important and hard-to-reach causal understanding goal (Grotzer, 2012) – and would reinforce learning goals related to elemental cycling and conservation of matter. We focused on depicting the cycles of three atoms that would highlight the ecological processes most important to the Pond scenario – oxygen, phosphorus, and carbon. Oxygen is a critical element in the story, yet it is not often the primary focus of student attention. By emphasizing oxygen in Atom Tracker, we provided students an additional view of the problem space, one that supports connections between the details of photosynthesis and respiration and the larger-scale effect of oxygen depletion in the pond. Tracking the carbon atom helps to reinforce the goals of understanding photosynthesis and respiration. The phosphorus atom helps students notice that fertilizer applied to the land surrounding the pond eventually makes its way into the pond, illustrating ways in which the pond is connected with events in the larger surrounding watershed.

Atom Tracker – originally called "Adopt-an-Atom" – began with the idea of "adopting" an atom to study, which gave rise to designs focused on communication and visualization of the adopted atom, including customization of its appearance (Figure 2). After discussion among the design team, we realized that fashioning one's own atom might overshadow the learning goals, as personalized atom colors could obscure the idea that the adopted atom was "one of many," not unique. This also brought to light a discussion around the role of anthropomorphism in engagement and learning. To resolve disagreements among the design team about the relative pros and cons of using anthropomorphic elements in our design, we turned to the literature to better understand the potential implications. We found that anthropomorphic language is commonly used in science classrooms and has been shown to provide communicative value for teachers and students when it is used intentionally by teachers (Taber & Watts, 1996), yet the positive use of anthropomorphic language may be mediated by age and teacher experience, as older students are able to distinguish between anthropomorphic descriptions and anthropomorphic reasoning in science explanations (Tamir & Zohar, 1991; Kallery & Psillos, 2004). While recommendations in the research are equivocal, we decided to use some degree of anthropomorphization in ways that support engagement and practical interpretation of the processes depicted.
The anthropomorphic characteristics (i.e., facial features) served to distinguish the atom of interest among the other atoms and molecules in the images, without the need for other visual indicators like different colors or sizes. Anthropomorphic language may help students see processes from the perspective of the atom, and students may be more engaged by following an atom character who could describe what had just happened during physical and chemical transformations. We thought students would be able to recognize that atoms don't really talk, and we used passive language communicating that the atom of interest was "along for the ride," rather than expressing any agency, choice, or intention regarding the pathway and processes in which it took part. Throughout the design of the Atom Tracker experience, the team acknowledged and revisited the tension between representing processes from an atom's perspective (viewed as providing higher engagement value) vs. presenting an inanimate view of molecular processes (viewed as providing a more scientifically accurate portrayal of ecosystem processes).

Figure 2. Mock-up of a "design your atom" activity proposed during early design work.

Communication between the atom and the user evolved from a visible atom character writing "blog posts" (Figure 3) to the notion of students being able to turn on an atom tracking tool that would reveal the location of an invisible atom. Students then search the virtual space to find an atom indicator, and click on this indicator to see what their atom has been doing (Figure 4). In this case, the agency shifted from the atom to the user; instead of the atom being a central character in the virtual world, the user becomes the driver of the narrative by finding the atom indicators and piecing together the story over time. Each indicator revealed a situated view of the atom as part of a larger molecule, with the "tagged" atom indicated by a smiley face. The final form was a comic strip-style storybook, where the user finds each page of the narrative by visiting different dates and locations in the virtual world (Figure 4).

Figure 3. Early prototype of the Atom Tracker integrated with the EcoMUVE Pond environment with a "blog"-based communication tool.

Figure 4. Final design for the EcoMUVE Atom Tracker tool with a billboard and arrow indicating the presence of a track-able atom, and a comic strip illustration of the activity of the atom that appears when a student clicks on the billboard.

To see a complete story for an atom, students have to find the atom signposts on each of eight virtual days. While the virtual world provided compelling mechanisms for embedding molecular processes in realistic contexts, and for revealing these hidden processes to students in fun ways, this disjointed tracking across virtual time made it difficult for students to synthesize a complete elemental cycle. We therefore created paper-based worksheets to accompany the Atom Tracker activity, on which students could document the location, type of molecule, and important details from the atom's message (Figure 5), and conceptualize the big picture. The design team began to consider whether there might be better ways to harness the potential value of learning about ecosystem processes in the real-world contexts in which they occur.

Figure 5. Worksheet used in conjunction with the EcoMUVE Atom Tracker activity during classroom implementation of the EcoMUVE curriculum.
Context and Goals for the Augmented Reality Version of Atom Tracker

When the design team received funding for EcoMOBILE, a project to explore mobile-technology-enabled ecosystem science learning, we identified Atom Tracker as a potential activity, reasoning that this different technology format might better support learning about material cycles that are distributed in nature and occur across time and space scales that are difficult to experience. Augmented reality (AR) is an immersive interface using mobile, context-aware technologies and software that enables participants to interact with digital information embedded within a physical environment, and it offers affordances to foster understanding of difficult science concepts through situated visualization of otherwise unobservable processes. There are two forms of augmented reality, commonly referred to as "location-based" and "vision-based" AR (Dunleavy & Dede, 2013), each of which has potential benefits for learning about molecular processes in real ecosystems. The FreshAiR augmented reality platform (playfreshair.com) allows the integration of location-based and vision-based AR elements within a learning experience. One of the lead members of the design team, Chris Dede, had previously worked on an augmented reality-focused research project (the Handheld Augmented Reality Project – HARP), so he had first-hand experience with the challenges and benefits of designing AR experiences (O'Shea, Mitchell, Johnston, & Dede, 2009). Based on this experience and past collaboration with the colleagues who were developing the FreshAiR platform, the design team decided to further explore the utility of AR for designing an outdoor version of Atom Tracker. As we began envisioning what a mobile augmented reality version of Atom Tracker might look like, we developed a few design themes based on known challenges to learning about molecular processes and on our hypothesized affordances of mobile augmented reality.

Multiple Representations – A range of digital visualization and animation tools have recently emerged that allow students to view, interpret, and interact with molecular phenomena at various scales using 2D and 3D models and relatively sophisticated simulations (Snir, Smith, & Raz, 2002; Stern, Barnea, & Shauli, 2008; Ozmen, 2010; Pallant & Tinker, 2004; Xie & Tinker, 2006). Research shows that there is value in providing opportunities to view multiple forms of representation (Wu & Shah, 2004; Ainsworth, 1999), in manipulating and interacting with physical models and visual representations, in supporting metacognition and reflection related to the visualizations (Wu & Shah, 2004; Chang et al., 2009), and in making links among these representations visible (Wu & Shah, 2004). We recognized that the AR platform would provide new modes of interaction with multiple representations of the molecular structures and processes of interest, because students would be able to view images, videos, and interactive 3D models within the AR experience, and these digital representations could be combined with interaction with physical objects like LEGO®-based molecular models or trees.

Bridging Scales – Researchers and practitioners have identified an enduring challenge of helping students connect the macroscopic, observable properties of matter with the submicroscopic behavior of atoms and molecules that gives rise to these properties (Talanquer, 2010; Smith et al., 2004).
Using digital visualizations and animations allows us to virtually speed up or slow down the rate of ecological processes and to magnify atomic and molecular components, thereby rendering these processes and components visible and observable by the user. The FreshAiR AR platform allowed us to embed three-dimensional models and animations into the user experience using vision-based AR built upon the Vuforia™ (Qualcomm, Inc.) software platform. Visualizations were triggered using an optical label that can be visually recognized by the camera application on a mobile broadband device. These labels function similarly to quick response (QR) codes, but in this case are images of natural elements one might find in an outdoor environment (Figure 6). These paper-based triggers could be affixed to physical objects and unlock a virtual view of the molecular make-up of the physical object. Additionally, the FreshAiR and Vuforia technology provided a proximity functionality, such that different views of the phenomenon could be activated when a student held their mobile device at varying distances from the label. This proximity functionality inspired us to think about ways that students could "zoom in" to see molecular processes manifest at different scales – providing students with interactive "x-ray" molecular vision to see the atoms and molecules within physical objects in the environment. Thus, we identified a number of ways the AR functionality could support students in bridging understanding across scales.

Figure 6. Vision-based augmented reality label that functions like a quick-response code to trigger information on the visual display of a mobile device.

Situated within the Real World – The biological processes of photosynthesis and respiration are happening all around us, yet teaching these concepts often involves abstracted representations (chemical symbols and equations) that are rarely linked with real-world phenomena (Smith, Wiser, Anderson, Krajcik, & Coppola, 2004; Stavridou & Solomonidou, 1998; Stern & Roseman, 2003). Contextualized triggering of information and visualizations was enabled by the mobile technology using location-based AR functionality. The AR editing platform provides a Google Maps-based drag-and-drop feature that designates the coordinates of a location-based trigger. When students visit the real location, the client-side FreshAiR mobile app displays a virtual marker, label, compass, and approximate distance to a nearby location-based trigger (Figure 7). Students use the visual display, compass, and distance indicators to navigate to the virtual marker. When they arrive (within a radius of approximately 20 feet of the coordinates indicated in the FreshAiR editor, due to inherent variability in the precision of GPS), location-specific information is displayed on the screen. Multiple location-based triggers can be included in the design of the experience, and the FreshAiR editor settings allow the designer to turn these virtual markers on/off according to the desired sequence of the experience. This allows the designer to guide the user to specific locations in a physical space and through the instructional elements of the experience. Thus, mobile AR technology can be used to "activate" views of molecular processes when students encounter meaningful objects or locations in the real world.

Figure 7. The visual display of a location-based hotspot on the augmented reality client-side application using a heads-up display.
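To make these trigger mechanics concrete, the sketch below shows the kind of proximity check a location-based AR client performs: compare the device's GPS fix against each hotspot's designated coordinates and fire when the student is inside the trigger radius. This is our own minimal reconstruction in Python, not FreshAiR's actual code; the names, the use of the haversine formula, and the ~20-foot (6 m) radius constant are illustrative assumptions.

```python
import math

# Hypothetical sketch of the proximity check a location-based AR client
# performs; class names, the haversine formula, and the ~20-foot radius
# are our illustrative assumptions, not FreshAiR's actual implementation.

EARTH_RADIUS_M = 6_371_000   # mean Earth radius in meters
TRIGGER_RADIUS_M = 6.0       # roughly 20 feet, matching typical GPS precision

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (latitude, longitude) points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(min(1.0, a)))

class Hotspot:
    def __init__(self, name, lat, lon, content, visible=True):
        self.name, self.lat, self.lon = name, lat, lon
        self.content = content
        self.visible = visible  # designers toggle this to sequence the experience

def triggered_content(device_lat, device_lon, hotspots):
    """Yield the content of each visible hotspot the device has walked into."""
    for h in hotspots:
        if h.visible and haversine_m(device_lat, device_lon, h.lat, h.lon) <= TRIGGER_RADIUS_M:
            yield h.content
```

Toggling the visible flag in this sketch corresponds to the editor setting described above: a designer can hide a hotspot until earlier steps are complete, which is what makes sequenced, guided experiences possible.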
Taken together, these design themes describe what motivated us to use augmented reality (AR) as a way to expose students to elemental cycling in real ecosystems, and we used these themes to guide our design. Below we describe the design of an AR-enabled version of Atom Tracker that integrates these design themes with student interaction in an outdoor environment in an attempt to connect understanding of ecological processes to real-world experiences.

The Design Team

The primary design team was comprised of the Principal Investigators on the EcoMOBILE grant (funded by the Qualcomm Wireless Reach Initiative and the National Science Foundation grant no. 1118530), Chris Dede and Tina Grotzer. Chris Dede is an expert in educational technology and learning and had prior experience conceptualizing and designing mobile and augmented reality learning experiences. Tina Grotzer is an expert in causal cognition and has extensive experience designing classroom curricula focused on helping students understand complex relationships in ecosystem science. Shari Metcalf is an expert in computer interface design and has extensive experience in designing various technologies to support science learning. Amy Kamarainen is an expert in ecosystem science and has experience in designing experiential outdoor learning activities. This core design team worked closely with a single science teacher from a nearby middle school, who reviewed design components and contributed ideas based on teaching materials previously used in her classroom. Further, we involved the developers of the augmented reality platform FreshAiR (playfreshair.com), who had worked with Dede on previous AR projects. The FreshAiR team worked closely with us during early development of the design of augmented reality components, contributed ideas about the experience design, and provided insight into features and functionality of the AR platform, particularly in how to design elements that would take advantage of the proximity-driven vision-based augmented reality functionality (Figure 8). Meanwhile, we provided feedback to their team about the functionality of the software in the field, and requested new features to support usability of the client-side application. As noted above, FreshAiR's vision-based AR was built upon the Vuforia™ (Qualcomm, Inc.) software platform; this emerging technology was actively under development by Qualcomm, and we communicated with members of the Qualcomm development team to ask questions and gain support for use of this new software. We worked with teams of graphic designers and digital artists (Wisdom Tools, Blot Media, and PublicVR), who were accustomed to working on science- and education-related projects, to create the final illustrations, visualizations, and animations used in the experience. Finally, we periodically involved graduate students at the Harvard Graduate School of Education, who contributed expertise in education technology, science education, classroom instruction, curriculum design, technology design, and game design.

Figure 8. Storyboard for the vision-based augmented reality functionality that incorporates 3D models and animations with proximity-based transitions between visual displays. (used with permission from MoGo Mobile, Inc.)
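The proximity-driven transitions storyboarded in Figure 8 amount to choosing which model or animation to render based on how far the device camera is from the printed label. A minimal sketch of that selection logic follows; the tier names and distance thresholds are invented for illustration (the duck-to-starch "zoom" example comes from the carbon experience described later).

```python
# Hypothetical sketch of proximity-driven view selection for a vision-based
# AR target (cf. Figure 8); tier names and thresholds are our own inventions.

ZOOM_TIERS = [
    # (minimum camera distance in meters, view to render), farthest first
    (2.0, "macro"),      # far away: the whole organism (e.g., the virtual duck)
    (1.0, "organ"),      # closer: inside the duck's stomach
    (0.0, "molecular"),  # closest: the starch molecule itself
]

def select_view(distance_to_label_m: float) -> str:
    """Return which 3D model/animation to render for the current camera distance."""
    for min_distance, view in ZOOM_TIERS:
        if distance_to_label_m >= min_distance:
            return view
    return ZOOM_TIERS[-1][1]  # fallback; unreachable for non-negative distances

# Example: select_view(3.0) -> "macro"; select_view(0.4) -> "molecular",
# so walking toward the label "zooms in" through the tiers.
```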
The design of a mobile augmented reality version of Atom Tracker brought to the surface a number of design tensions, which we explored through the design and testing of two different versions of the mobile AR experience. These versions highlight tensions between designing for 1) support of student engagement and interpretation while also providing an authentic portrayal of realistic molecular behavior, and 2) integration of both physical and virtual interactions during outdoor experiences. Atoms and molecules have no agency – they are passive, nonliving units of matter, and the transformations and reactions in which they partake happen due to probabilistic interactions with other molecules. Meanwhile, the use of mobile technology can support student choice and agency, which can be engaging aspects of a mobile AR experience, and the AR platform offers mechanisms for embedding game-based and narrative elements into the experience. Thus, we found ourselves considering the tradeoffs between engagement and realism throughout the design process. Also, use of the mobile AR platform unlocked the potential to embed virtual elements at any time or place during the experience. We found it important to consider how to best facilitate the interaction between virtual and physical elements throughout the design. Each design iteration resides at an intersection of the push and pull between the engagement-vs.-realism and physical-vs.-virtual perspectives, and we point to these tradeoffs throughout the design descriptions below.

EcoMOBILE Atom Tracker Version 1.0 – A Focus on the Carbon Cycle

During development of the first version of the mobile Atom Tracker, we used a design heuristic of thinking about what Atom Tracker might look like if it were a virtual tool used to track carbon by ecosystem scientists in the real world. This led us to adopt a conceptual framework often used by ecosystem scientists called a stock and flow model. A stock and flow model is a way to represent a system that helps one trace a material through the system. The material may temporarily reside in "stocks" (locations), and the change from one stock to another is called a "flow" – a stock and flow model can provide an accounting of where, and how much, material can be found in the various stocks at any given point in time. We used a simplified conceptual stock-and-flow model to think about significant carbon-based components within ecosystems (the stocks, represented by boxes) and the processes or events that lead to a transition between components (the flows, represented by arrows) (Figure 9). This conceptual framework emphasized the many potential pathways from one component to another, as opposed to the more linear storyline portrayed in the EcoMUVE version of Atom Tracker. Also, the idea of the "stocks" aligned well with location-based AR triggers, while the "flows" became the impetus for movement among locations.

Figure 9. Image of a white board displaying a simple "stock-and-flow" depiction of the components and processes to be represented in Atom Tracker.

We further developed this conceptual framework as a map of the potential location-based triggers that would be integrated into the game, where each box represents a location (Figure 10). Thus the design framework underlying version 1.0 – a stock-and-flow model – was strongly tied to abstract representations real scientists use to understand this system.
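In computational terms, a stock-and-flow model of this kind is a directed graph whose nodes are stocks and whose labeled edges are flows. The sketch below expresses a carbon version of such a graph in Python; the particular stocks and processes are our illustrative reconstruction of the whiteboard model, not a transcription of Figure 9.

```python
# Minimal sketch of a carbon stock-and-flow graph (cf. Figure 9). Stocks are
# nodes; flows are (process, destination) edges. The exact stocks and
# processes on the whiteboard may differ from this illustration.

CARBON_FLOWS = {
    "atmosphere":  [("photosynthesis", "producers")],
    "producers":   [("consumption", "consumers"),
                    ("respiration", "atmosphere"),
                    ("death", "dead matter")],
    "consumers":   [("respiration", "atmosphere"),
                    ("death", "dead matter")],
    "dead matter": [("decomposition", "atmosphere"),
                    ("burial", "fossil fuel")],
    "fossil fuel": [("combustion", "atmosphere")],
}

def flows_from(stock):
    """Which stocks can the atom flow to from here, and via which process?"""
    return CARBON_FLOWS.get(stock, [])

# A player's path through the game is a walk on this graph, e.g.
# atmosphere -> producers -> consumers -> atmosphere, with each edge
# (photosynthesis, consumption, respiration) experienced as a "flow".
```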
With this initial framework in mind, we considered how users might be motivated to move from one location to another and learn along the way. We used a game mechanism of giving students choices at each location about where to go next – thus the "flows" became opportunities for the students to make a choice and take an action. At any particular location, they would be given two choices, one of which might be incorrect. For example, when at the atmosphere location, students might be given the choice of next visiting the plant/producer location or the dead matter location – it would make little sense to choose to go from the atmosphere directly to dead matter. Students would decide where to go based on their knowledge of the carbon cycle, and if they had a wrong idea, they might arrive at a "dead end." Students might complete multiple cycles, participate in multiple processes, and eventually visit all stocks/locations. We envisioned that users would learn about and experience the cyclic nature of carbon movement through repeated visits to some of the locations. As the activity would be cyclic in nature, we thought students might need some way to document the choices they had made, and because the FreshAiR platform did not have a mechanism for this, we developed the location map into a worksheet that students could fill in as they completed the Atom Tracker activity (Figure 10). The worksheet was intended to serve several purposes. First, the names of the "stocks" were given, but the students would need to fill in the name of the process that caused them to transition from one location to the next, so it would help them pay attention to the processes (e.g., photosynthesis) responsible for the transitions. Second, it could be a map for students, which would guide them through the process of the game. Third, it served as a place to record information, limiting cognitive load while retaining the information for later use and consideration. We decided, however, not to give this version of the worksheet to students, as we wanted to capitalize on student engagement, and one of our teammates commented that the worksheet seemed too much like school. Instead, we brainstormed ideas to make the experience more like a game, such as making it a maze, a treasure hunt, or including puzzle pieces. We eventually decided to go with the puzzle idea.

Figure 10. Map of mobile AR location-based triggers, which also served as a mock-up of a worksheet proposed for use during the EcoMOBILE Atom Tracker activity.

While we didn't use the worksheet with the students, it served as a design guide for developing the content to be presented in the puzzle pieces. We decided the students would pick up a physical puzzle piece at each hotspot, and each piece would have blanks on it to be filled in by the student (Figure 11). After students collected all pieces, they would meet their teammates to figure out the connections among these pieces. We decided to have component puzzle pieces (to represent the stocks) and also "connection cards," which would be filled in to draw the flows between the components, like this: "____________ is connected to _______________ through the process of ________________". Each connection card served as a hint for students to make a connection between certain locations/puzzle pieces. Thus, each virtual location was associated with the collection of a physical puzzle piece that represented a relevant stock or flow in the system.
Figure 11. Paper-based cards illustrating Atom Tracker components used during EcoMOBILE Atom Tracker V.1.0 – each piece was placed at one of the locations, and students picked them up as they visited each location.

We pilot tested the usability of this design with four students during a roughly 1.5-hour period after school. Not knowing much about the physical context for the activity before carrying it out, we had designed all of the hotspots to be virtual locations, meaning that the "producers" hotspot was not necessarily connected with a physical producer in the real world. The hotspots were scattered within the space of the sports field behind the school. The four students played through the experience for about 45 minutes, and then two students (the other two had to leave early) spent an additional 30 minutes building connections based on the puzzle pieces they had collected. We received feedback from the participants through observations during the AR experience and during the process of constructing the puzzle diagram afterwards. Feedback from this small group indicated that they enjoyed the act of finding the location-based triggers, but later became frustrated by having to go back to locations they had already visited that were fairly far away. These students recognized that the "atmosphere" location was purely virtual, and they complained about having to go back "all the way over there" to get the next puzzle piece. Our lack of knowledge about the landscape meant that some of the locations had been placed on either side of a chain link fence, which had only one entrance that was not located near the actual destination, so navigating to this location was frustrating for students. Pilot testing this version and encountering the problem with the fence made it apparent how important it is to be aware of the "lay of the land" when designing the activity and placing the hotspots. Student feedback also made us reflect on how we could better use the local physical context as an asset in design, rather than making all of the hotspots purely "virtual."

We identified a number of challenges that arose in trying to embed the cyclic nature of carbon movement as a dominant game mechanism. While we allowed the students to make decisions about which hotspot to visit next, the students didn't necessarily devote much attention to these decisions, so their choices did not reliably reflect their understanding of which option was most sensible. They seemed to choose an option arbitrarily in order to see what would happen next. Thus, the decision-making mechanism was not strongly tied to outcomes that were meaningful to the students. Further, this cyclic narrative was difficult to orchestrate within the augmented reality design platform because it required an accounting system for how many times a student had visited a particular location, along with branch points that depended on that count. For example, when a student first visited the "atmosphere" they might be presented with the option to go through the process of photosynthesis or respiration, but on a second visit to the same location we presented students with the options of diffusion or photosynthesis. Building such an experience with multiple dependencies became extremely cumbersome, and we observed a few bugs in the field that arose due to programming errors that had not been caught during testing.
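To illustrate why this became cumbersome, here is a minimal Python reconstruction of the visit-dependent branching (our own sketch of the logic, not how FreshAiR's authoring tool represents it): every combination of location and visit count needs its own explicitly authored branch, and the combinations that never arise during testing are exactly where bugs can hide.

```python
from collections import defaultdict

# Our own minimal reconstruction of visit-dependent branching: the choices
# offered at a stock depend on how many times the player has already been
# there, so per-location visit counts must be tracked, and every
# (location, visit number) combination must be authored and tested.

visits = defaultdict(int)  # per-player, per-location visit counter

BRANCHES = {
    ("atmosphere", 1): ["photosynthesis", "respiration"],
    ("atmosphere", 2): ["diffusion", "photosynthesis"],  # the example from the text
    # ... one entry for every (stock, visit count) pair the designer expects;
    # combinations missed during testing are where field bugs crept in.
}

def choices_at(location):
    """Record a visit and return the process choices to offer the player."""
    visits[location] += 1
    return BRANCHES.get((location, visits[location]), [])
```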
While a "fossil fuel" stock was included in our initial design, we didn't incorporate it into the version that was pilot tested, because it was hard to present a process that occurs on a much longer time scale (millions of years for fossil fuel to form) within the same experience that was representing processes, like photosynthesis and respiration, that occur over much shorter time scales. However, when the teacher mentioned how valuable this long-term perspective was, we realized the importance of including this hotspot, and made sure to include it in version two of the design.

After the experience, the students were able to engage in a fruitful group discussion to construct a physical representation of the connections among the puzzle pieces (Figure 12). This final display echoed the original worksheet we had outlined, so we felt we had created an experience that allowed the students to construct the same information in a way that was more interactive and student-driven than a worksheet would have been. While picking up the fill-in-the-blank clues at each hotspot was an engaging aspect of the experience, the design team realized this paper-based activity might not be scalable, as the teacher or organizer of the activity would need to print, laminate, and hide the clues at the various locations ahead of the activity. However, this paper-based pilot helped us think about other ways we could more deeply leverage the connections between virtual aspects of the experience and physical features of the environment.

Figure 12. Web constructed by students following their use of Atom Tracker V.1.0.

EcoMOBILE Atom Tracker Version 2.0 Design

Between versions 1.0 and 2.0, the core design team experienced a turnover in the cohort of Master's students who were working with us on the project. This change in team composition brought new perspectives and ideas on how to design the mobile AR Atom Tracker experience, and we revisited the tradeoffs between engagement and realism, and between virtual and physical. Following group reflection on the version 1.0 experience, we re-assessed our learning goals and re-focused on design elements that most strongly took advantage of the particular combination of technology and the outdoor, physical context with which we were working. This led us to focus on 1) better integrating engaging elements with learning goals through mechanisms like real-time feedback on embedded questions, multiple levels that align with meaningful changes in molecular structure and processes, and collaboration through a jigsaw pedagogy; and 2) bridging among representations by embedding virtual animations that are explicitly tied to real physical objects.

Students learning about the processes of photosynthesis and respiration often struggle to truly understand the relationship between non-living elements and living organisms (Lin & Hu, 2003). We recognized an opportunity to use vision-based AR to provide students with virtual representations of the processes of photosynthesis and respiration "happening" within physically present living organisms. Further, carbon and oxygen were chosen because both are necessary for the processes of photosynthesis and respiration; therefore, we could use the two elements to design a complementary, jigsaw pedagogy in which students who tracked carbon would meet up with students who tracked oxygen to witness how the cycles of the two elements interact.
Thus, we decided to focus on learning goals related to the conservation of matter and the processes of photosynthesis and respiration. With these learning goals in mind, we decided to use the AR experience to demonstrate the principle of conservation of matter and to orchestrate intersecting pathways of interaction between carbon and oxygen, which would provide opportunities for students to interact and collaborate. As this goal of collaboration depended on students arriving at a common location at a similar time, we moved away from the idea of allowing students to make their own choices (used during Atom Tracker Version 1.0) and instead used a fixed storyline for each atom that would emphasize movement and transformation.

Engagement through Real-Time Feedback

The AR platform provided a mechanism by which we could provide interactive feedback to students based on their responses to questions embedded in the experience. Rather than a static worksheet graded by a teacher after the fact, the embedded questions allowed us to deliver real-time feedback on student understanding while students were immersed in the augmented reality experience. Embedded questions were used as prompts for reflection and to help students identify their own misconceptions (Figure 13). These elements were designed to engage students and also to reinforce learning goals. We used the questions as "gate keepers" – the students needed to answer the questions appropriately before they could move on to the next part of the experience. This mechanism was used as a challenge for students, to provide pacing, and to help reinforce the value of the information provided in the written text of the experience. The questions focused on ideas around conservation of matter, and if a student answered a question incorrectly, the AR software delivered feedback that would help to clarify the concept.

Figure 13. An example of an embedded multiple-choice question presented to students during Atom Tracker Oxygen Level 1. Feedback was delivered based on the response to this question.

Engagement through Multiple Levels and Collaboration

We also chose to use multiple levels within the experience (Figure 14). This was done for a number of reasons: 1) to support engagement by marking and celebrating students' understanding and accomplishments at the end of each level, 2) to provide a natural break between conceptually distinct portions of the experience, and 3) to act as a "safety valve" for the experience in case technical difficulties were encountered on any specific level.

Figure 14. Overview of the sequence of movements and transformations during levels 1 and 2 of the oxygen and carbon experiences during Atom Tracker V.2.0. The red, white, and grey molecular representations show which molecule the atom was in during each part of the experience, the labels between molecules show which process led to a transition between molecules, and the labels to the sides show the location of the molecule when the students observed it.

The levels represented increasing difficulty in the concepts students would interact with. During oxygen level 1, students would track their oxygen atom as part of a water molecule through the water cycle, thus experiencing only physical movement and phase changes (Figure 14). During level 2, the water molecule would undergo a chemical transformation as it is used during the process of photosynthesis; students then collaborate so that their atoms become part of a single oxygen molecule, which later undergoes respiration.
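Before turning to the carbon path, the gate-keeper mechanic described above can be sketched as a simple loop: the next part of the experience stays locked until the embedded question is answered correctly, and each wrong choice returns concept-clarifying feedback. The question wording and feedback below are our own illustration, not the actual item in Figure 13, and in practice FreshAiR authors configure this behavior through the editor rather than code.

```python
# Hypothetical reconstruction of the gate-keeper flow (cf. Figure 13): the
# next hotspot stays locked until the student answers correctly, and each
# incorrect choice returns feedback targeting the likely misconception.

QUESTION = {
    "prompt": "What happens to your oxygen atom when the water evaporates?",
    "choices": {
        "a": ("It is destroyed.",
              "Not quite - atoms are conserved; evaporation is only a phase change."),
        "b": ("It stays in the same water molecule, now as vapor.", None),
        "c": ("It turns into a different kind of atom.",
              "Atoms keep their identity; only molecules get rearranged."),
    },
    "answer": "b",
}

def gatekeeper(question, get_choice):
    """Loop until the correct choice is made; deliver targeted feedback on misses."""
    while True:
        choice = get_choice(question["prompt"], question["choices"])
        if choice == question["answer"]:
            return  # unlock the next part of the experience
        print(question["choices"][choice][1])  # feedback tied to the wrong answer
```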
During carbon level 1, students track their atom through the chemical transformations of photosynthesis and respiration. In level 2, carbon trackers are "transported" to a different era, 62 million years ago, in order to track what might have happened to their carbon atom if it had been buried in pond mud during the age of the dinosaurs and is now available as fossil fuel (Figure 14). Thus, carbon level 2 is more complicated because it considers atoms and molecules over a long time scale. Finally, we designed a level 3 activity in which the students who had played the carbon tracker came together with students who had played the oxygen tracker in order to simultaneously participate in the process of photosynthesis: at the end of level 2, the oxygen player's atom is part of a water molecule, while the carbon player's atom is part of a carbon dioxide molecule.

Interaction with Virtual and Physical Elements

In version 1.0, all of the locations were virtual, and feedback from the students helped us realize that we had missed an opportunity for students to more deeply notice and engage with physical elements of the natural environment while playing Atom Tracker. We realized we could take greater advantage of the fact that the atmosphere is truly present, things all around us are made of carbon, and students have daily experience with materials made of oxygen and carbon. For version 2.0 of Atom Tracker, we thought deeply about the relationship between the physical environment and the ways that the technology could support student interaction with the environment and with the concepts. Movement of molecules was communicated using the location-based AR triggers to embed views of atoms and molecules in different areas of the outdoor space (Figure 15). As students moved through the learning activity, they would also move through physical space – for example, they could track the movement of their atom from the edge of a pond to a nearby tree (Figure 14). At each location, the content embedded in the mobile AR experience encouraged students to interact with and observe characteristics of the real environment – for example, scooping up water from the pond, scanning vision-based triggers posted on trees (Figure 16), or observing the weather on the day of the field trip.

Figure 15. Screenshot from the FreshAiR editor showing the location-based hotspot triggers in the original design near the pond.

Figure 16. A vision-based AR trigger placed on the trunk of a living tree.

Another way in which we decided to emphasize the connection between the virtual experience and physical elements was inspired by our conversations with the science teacher whose class was to participate in the Atom Tracker field trip. The teacher had typically helped students think about atoms, molecules, photosynthesis, and respiration by asking them to construct LEGO® models of the molecules involved in these processes. We adapted this classroom activity to be conducted in the field, such that, as students visited the location-based hotspots, they would find LEGO® pieces at the target hotspot and could construct the molecule using a representation of the atom they were tracking as one of the LEGO® pieces.
Our design vision was that this LEGO® piece would serve as a persistent physical representation that would remind the student which kind of molecule their atom was a part of, and would reinforce the idea that atoms get rearranged into different molecules during the various movements and transformations the students would be tracking. Thus, the combination of physical LEGO® pieces, virtual animations of molecular processes, and interaction with macro-scale physical objects like trees meant that students had opportunities to interact with multiple representations of the atoms and molecules across different scales and different forms of abstraction.

Atom Tracker V2.0 Design Walkthrough

The field trip was designed so that some students would experience the carbon tracker and some students would experience the oxygen tracker. Here we describe the oxygen path in detail, followed by some examples from the carbon path.

Level 0 – Random Movement

First, to split the students into oxygen and carbon groups, we created an activity called "random" that served as the starting point of the field trip. The random activity was designed for two purposes: 1) it helped to reinforce a learning goal from the middle school science curriculum that the movement of molecules happens randomly, and 2) it helped to randomly sort the students into two groups, which was done for research purposes. In the "random" activity, students located a location-based trigger that began a video of air molecules randomly bumping into each other. The students were assigned to either the oxygen or carbon experience based on which atom "bumped" into them first. Once an atom bumped into them, the students were given a code used to access either the carbon or oxygen experience.

Oxygen Level 1 – Movement and Phase Change through the Water Cycle

After beginning the oxygen atom experience around the water in the pond, students follow their water molecule through the water cycle. They were prompted to use transparent plastic cups to gather water from the pond (Figure 17). The virtual guide delivered a question about what the students could see in the water without AR; students might see plankton, duckweed, or floating debris. After the students answered the question, they used vision-based AR by scanning a QR code on the side of the plastic cup to reveal a molecular animation of water. After watching the animation, the students were asked a question designed to reinforce the idea that, although atoms make up all living matter, atoms are not living. The multiple-choice question was designed to give students feedback directly related to their response.

Figure 17. Image of a plastic transparent cup with a QR code taped to the side. We envisioned students using this cup to scoop water from the pond in order to "see" water molecules in the cup.

At the next location, the field trip called for learners to build a water molecule out of LEGO®. This was designed to help students understand the structure of molecules, and the construction was guided by the virtual guide through examples and instructions (Figure 18).

Figure 18. Example of instructions presented to students when asked to build a molecule out of LEGO® blocks.

The next three locations were designed to portray the physical movement of the water. Students were asked to carry the water in the cup and pour it near the roots of a tree. Near this location, a vision-based label depicting tree roots (Figure 19) was positioned on the ground near the roots.
When students viewed this vision-based label through the AR interface, it triggered an animation of their oxygen atom (as part of a water molecule) being absorbed by tree roots. Using another vision-based label on a tree trunk, students see an animation of their water molecule passing through the xylem of the tree (Figure 20). The third vision-based label, hanging from a tree branch, would reveal their water molecule going through a stoma and exiting a leaf into the atmosphere as water vapor (Figure 21). Thus, students could follow the process of transpiration, which involves a phase change of water from liquid to vapor.

Figure 19. (a) Vision-based AR label of roots placed on the ground near a tree and (b) molecular-level animation of a water molecule entering roots that was triggered upon scanning of the vision-based label. (animation used with permission from Wisdom Tools)

Figure 20. (a) Vision-based AR label of bark placed on the trunk of a tree and (b) molecular-level animation of a water molecule transported through the trunk that was triggered upon scanning of the vision-based label. (animation used with permission from Wisdom Tools)

Figure 21. (a) Vision-based AR label of leaves placed in the branches of a tree and (b) molecular-level animation of a water molecule leaving through a stoma that was triggered upon scanning of the vision-based label. (animation used with permission from Wisdom Tools)

The next hotspot showed the oxygen atom as part of the atmosphere and connected it to the concepts of water vapor and humidity. At this hotspot, we wanted students to make the connection that humidity results from the number of water molecules in the air, so the experience called for the students to use a weather application to find out the humidity at their location at that specific time. After checking the humidity, the virtual guide directed the students to a YouTube video that showed water vapor condensing into clouds. The final step of level one was their oxygen atom falling back into the pond in the form of a raindrop, thus completing one pathway through the water cycle. Before advancing to level two, the students were required to answer a number of questions designed to review the concepts, like conservation of matter and transpiration, just experienced in level one of the oxygen Atom Tracker (Figure 22).

Figure 22. An example of a multiple-choice question students had to answer correctly in order to move on to Level 2.

Oxygen Level 2 – Chemical Transformation through Photosynthesis and Respiration

Oxygen level two was designed to follow a similar outline to level one. The students would access location-based triggers and animations around the pond and trees. This time the students would return to the same tree and follow their water molecule through the roots into the trunk and the leaves. However, upon viewing the vision-based label in the leaves for the second time, they would see their molecule participate in photosynthesis (Figure 23), rather than transpiration.

Figure 23. Molecular animation of photosynthesis happening inside of a chloroplast. (animation used with permission from Wisdom Tools)

After being released from the tree as part of an oxygen molecule, the students return to the LEGO® station and are guided through the process of photosynthesis using the LEGO® pieces. The students tracking oxygen atoms had to meet up with another student in order to have the right combination of molecules to "do" photosynthesis.
The oxygen atoms from the two students were combined into a single oxygen molecule. The students were therefore instructed to work together for the rest of the Oxygen Level 2 experience, and the pair tracked the molecule as it moved through the atmosphere, diffused across the surface of the pond, and became dissolved oxygen in the pond water. Eventually, this dissolved oxygen molecule is taken in by a fish and used during respiration, which students viewed through 3D visual simulations within the AR (Figure 24).

Figure 24. 3D animation of cellular respiration in a mitochondrion. (animation used with permission from Wisdom Tools)

The students were then led back to the LEGO® hotspot, where they learned that their oxygen atom had again become part of a water molecule as a product of respiration. More embedded questions ensured that students understood the products of respiration, and that oxygen atoms are neither created nor destroyed in the process – reinforcing the idea of conservation of matter.

As we designed the specific animations and visualizations that would be triggered to illustrate movement and transformation, we worked closely with graphic designers and artists who were familiar with representing science content and molecular processes. For each instance of visualization or animation, we chose to use a macroscopic physical object (for example, water or a tree) as a relatable starting point (Figures 16 and 17). For the "zoom" views, we chose to use contextual cues derived from typical middle school instructional materials. We used chloroplasts and mitochondria as backdrops for the processes of photosynthesis and respiration (Figures 23 and 24), respectively, and used classic ball-and-stick representations to depict the physical structure of the molecules. This approach offers a dramatically simplified model compared to the true complexity of the concurrent processes occurring in plant and animal cells, but it aligns with representations that should be familiar to teachers and students in the middle school grades.

Carbon Level 1 – Chemical Transformation through Photosynthesis and Respiration

The carbon activity was similar to the oxygen activity and was designed to provide a complementary experience. The storyline began with the carbon atom as part of a carbon dioxide molecule, which is taken up by a duckweed plant and incorporated into a starch molecule through photosynthesis (Figure 14). The students then visit another location-based hotspot, where the duckweed gets eaten by a virtual duck. The students scanned a vision-based label that revealed a virtual duck at the farthest proximity; when the student moved the mobile device closer to the target (i.e., when they "zoomed in"), they could see the starch molecule inside of the duck's stomach (similar to Figure 8). Students then follow the carbon atom to the next location-based hotspot, where they see that the starch molecule has been excreted from the duck and has sunk to the bottom of the pond. At this location, a location-based trigger activated a short video showing the process of bacteria breaking down the starch molecule during respiration (Figure 25), and the carbon atom becomes part of a carbon dioxide molecule once again. This signified the end of level 1, and the students were prompted to answer a few review questions in order to move on to level 2.
Figure 25. Screenshot of an animation showing cellular respiration happening within a bacterium. (used with permission from Blot Media Limited)

Carbon Level 2 – Time Travel to Prehistoric Times

During level 2, students were "transported back in time" to see what might have happened to their carbon atom if it had been part of a starch molecule buried in the mud of a pond 62 million years ago. We conveyed a sense of time travel using a short animation that showed movement and molecular change over a timeline (Figure 26). We used text and an animation to show how the starch molecule was converted into a hydrocarbon through catagenesis. Then the students moved to another hotspot where they returned to the present time, saw an image of an oil rig, and learned about how the carbon atom they were tracking was converted from a hydrocarbon to carbon dioxide during combustion in a car engine (Figure 27). Given the complexity of the molecules involved in Carbon Level 2, we decided not to use LEGO® to represent the molecules, but students did collect a LEGO® molecule of carbon dioxide at the end of Level 2. Finally, students met up with classmates who had tracked an oxygen atom in order to complete Level 3.

Figure 26. Timeline animation showing a rapid sequence of different molecules according to the relative time span over which each molecule persisted. (animation used with permission from Wisdom Tools)

Figure 27. Text-based description of combustion of hydrocarbons into carbon dioxide.

Level 3 – Bringing Oxygen and Carbon Together to Perform Photosynthesis

Finally, for level 3, students from the carbon and oxygen experiences met up and were presented with the goal of creating a glucose molecule. They were shown a map of locations representing various kinds of molecules and processes (Figure 28). They had to work together to decide which "molecule" locations to visit in order to collect appropriate building blocks, and then which "process" locations to visit in order to combine their building blocks in the right way to produce glucose.

Figure 28. Screenshot from the FreshAiR editor showing the virtual hotspots used in Level 3.

Reflections on EcoMOBILE Atom Tracker V2.0 in the Field

The implementation of the design was to take place at a pond near the school in early October. The students from four seventh-grade biology classes would participate in two separate groups on a single day, with about two hours on location at the pond for the Atom Tracker experience. However, a storm was forecast for the original day of the field trip, and we, together with the teacher, decided to postpone the field trip until the following week. Unfortunately, the rescheduled date coincided with Hurricane Sandy hitting the East Coast, and the field trip was postponed again. Due to scheduling conflicts, it became impossible to conduct the field trip at the nearby pond. The decision was made to move forward with the field trip, but to conduct it in a green space near the school adjacent to the parking lot. Also, the field trip would have to be condensed into forty-five minutes rather than the two hours allotted in the original design. These changes created a cascade of issues that helped reveal vulnerabilities in our design of instructional materials that combine emerging technologies with school and outdoor contexts. The research team used portable video recorders and shadowed a few students during each field trip to capture the students' experiences.
Below we describe in detail the vulnerabilities that were revealed, and we share reflections that may benefit others working in this emerging space.

Vulnerabilities Revealed by Taking Atom Tracker to the Field

The changes of location and timeframe (from two hours to forty-five minutes) significantly impacted our ability to accomplish the learning goals outlined in the design of the Atom Tracker experience. The 3G connectivity in the new location for the field trip was poor, and this caused significant delays in loading digital assets. The design team, in collaboration with the software developers, had attempted to plan ahead for this potential issue by creating an option to pre-load digital assets ahead of time over WiFi or in a location with a strong 3G internet signal, instead of loading them in the field during the experience. However, the last-minute change in location required updating the location of the hotspots, which meant that the digital assets had to be re-loaded at the time of game play. The change in location introduced a number of problems, as the design had been specifically tailored to be experienced at the pond location; it was conceived as a "place-dependent" design (Dunleavy & Dede, 2013; Kamarainen et al., 2015). On the day before the field trip, the design team updated much of the language in the experience so that the user experience would be coherent, but could not rearrange the entire design. This meant students still encountered portions of the experience that prompted them to observe the pond nearby, and this dissonance between the text of the experience and what students could see and do in the new location was clearly difficult for the students to interpret. Many students were confused when the language in the experience referred to the pond. On the other hand, we were able to seamlessly move the hotspots that had referred to trees near the pond, affixing the vision-based labels that would show transpiration to a tree near the school (Figure 16). Following this activity, the design team reflected on design mechanisms that would allow contextualization without being so tied to local features that the design becomes unusable in another location (Kamarainen et al., 2015). We found that considering design moves that offer flexibility in the mode and location of implementation is particularly important when working in outdoor contexts.

During the field trip, the design team and teacher recognized that the students seemed to be moving quickly through the experience and not spending as much time viewing the molecular animations as we had expected. When asked questions about what they were "seeing," some students offered appropriate explanations. For example, here is the transcript of an exchange between a student (S) and a researcher (R) who were viewing the animation in Figure 20:

S: "Oh hey – it's going up. That's the oxygen molecule going up in the tree, right?"
R: "Um, yeah, as part of what kind of molecule? What kind of molecule is your oxygen atom in?"
S: "In an oxygen molecule? Uh no, in a water molecule."

However, many of the students responded with superficial answers like "I'm seeing my atom in this tree." When further prompted by a member of the research team or a teacher, the students would slow down and realize a need for more interpretation.
During the field trip, the design team and teacher recognized that the students seemed to be moving quickly through the experience and not spending as much time viewing the molecular animations as we had expected. When asked questions about what they were "seeing," some students offered appropriate explanations. For example, here is the transcript of an exchange between a student (S) and a researcher (R) who were viewing the animation in Figure 20:

S: "Oh hey – it's going up. That's the oxygen molecule going up in the tree, right?"
R: "Um, yeah, as part of what kind of molecule? What kind of molecule is your oxygen atom in?"
S: "In an oxygen molecule? Uh no, in a water molecule."

However, many of the students responded with superficial answers like "I'm seeing my atom in this tree." When further prompted by a member of the research team or a teacher, the students would slow down and realize a need for more interpretation. Through questioning about what the different colors represented, what the circles represented, and which one was their atom, the students were able to unpack the animation, and a few students had "ah-ha" moments when they realized they were watching photosynthesis or respiration. For example, here is the transcript of an exchange between a student (S) and a researcher (R) who were viewing the animation in Figure 24:

S: "Oh, cool."
R: "What do you see?"
S: "It's like bacteria or something and this is the atoms inside of it."
R: "Can you tell what's happening?"
S: "They're inside the bacterium. Oh, and then they're like coming out (pause). Oh, so it's like using something cause when it comes out it doesn't have any of the white, oh no, it does. (pause) It separates it, and it makes it smaller. Um, yeah, they're in big chunks, like the molecules, then they're in smaller chunks here."
R: "Do you know what that process might be called?"
S: "Umm, respiration?"

However, based on uninterrupted observations of student interactions with the animations, it is likely that, without prompting, many students viewed the animations without understanding what they were viewing. It became clear that many of the students would have benefitted from additional scaffolding to support interpretation of these molecular animations of ecological processes; situating the animations in a real-world context was not always sufficient to build connections between observation and interpretation.

A debriefing among the researchers and the teacher following the field trip gathered observations and reflections about logistics and design issues; these are presented below in raw form to give the reader an insider's perspective on the implementation.

Logistics
• It took 45 minutes to set up the targets and LEGO® activities. Wind was a problem, and a few targets were blown away during the course of the day.
• A 45-minute period is not sufficient for students to use this new technology and also tackle learning goals. A few students we spoke to were able to demonstrate understanding of the intended learning goals, but many showed substantial misconceptions at the end.
• A PowerPoint introduction to the AR software was not sufficient.
  - In the future, students should have a chance to do a short "intro" experience (~20 minutes may be enough) with the phones before using them during a learning activity.
  - Such an activity should include examples of the things they'll need to do with the phones: log in and out, navigate among menu options, find hotspots, use the history, answer questions, use and adjust audio, and apply basic troubleshooting tips.
  - The intro could also give an overview of how the game works (the idea of levels and how to progress through the game).
  - Caveat: these students were exposed to the intro materials up to 10 days in advance of the implementation, so perhaps a visual introduction would be sufficient if it occurred immediately before implementation.

Design components
• The LEGO® molecules were useful. Students enjoyed them and referred to them when answering questions about what was happening to their atom; when asked to puzzle through what was happening in an animation, they would turn to the LEGO® molecules. We could integrate them even further by asking students to refer to them regularly.
• An initial question that required careful reading seemed to drive home the need to read attentively throughout the experience.
• Hotspot references that were divorced from the physical location did not work. For example, we asked students to imagine they were near the edge of the pond (since we could not actually be at the edge of a pond), but this just confused the students.
• All animations should include audio narration to help with interpretation.
  o In some cases narration links were broken, or students didn't have the volume turned up enough.

Design toward Learning Goals
• Overall, the learning goals around atoms and molecules require more scaffolding.
  o The various forms of representation require more scaffolding.
  o Animations of molecules in the air and soil were being misinterpreted.
• Misconceptions recognized:
  o Conflating the identity of atoms and molecules. Students were clear that they were tracking an atom, but would confuse atoms and molecules in their descriptions of activities, processes, and animations.
  o We may need to start with a simpler "level" that focuses on distinguishing between atoms and molecules and scaffolds students through the various representations (animations, processes).
• Conservation of matter:
  o On questions about conservation of matter (and atoms being recycled), some students used other clues within the question to answer it, but still held misconceptions that atoms could be created or destroyed through photosynthesis and respiration. The activity as designed may be more fitting for 8th graders who have previously been exposed to the particulate nature of matter.

The team encountered time commitments well above those anticipated for both setting up and carrying out the activity. While students were relatively savvy in handling and navigating the mobile technology and responded well to the logistical training, they could have used more time to internalize the goals and context of the activity. The design team might have better facilitated student learning with the AR experience by more explicitly communicating the goals of the activity and managing expectations about outcomes and end points.

Designing two versions of the Atom Tracker activity also helped us explore the relative value of place-dependent and place-independent designs (Kamarainen et al. 2015; Dunleavy & Dede 2013). With version 1.0, we concluded that our design was too place-agnostic: students couldn't see the purpose in walking a long distance to a virtual location called "atmosphere." In version 2.0 we therefore designed a version tightly tied to the physical resources specific to the location at hand. In the end, we saw the pendulum swing too far, and this highly place-dependent version became difficult to use when the location had to change at the last minute. In the next iteration of this experience, we have included references to physically present elements (like trees), but have found it useful to rely on largely place-independent designs.

The AR interface provided compelling ways to combine multiple representations of molecular processes and to present them in ways that bridge spatial and temporal scales. However, we found that interpreting multiple representations within the current design was challenging for students, and in the next iteration we hope to build in more scaffolding for interpreting the various forms of representation. This might be done through audio explanations used in association with the molecular animations, or through having students work in pairs throughout the experience, with embedded opportunities for peer discussion and interpretation; one possible shape for such scaffolding is sketched below.
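For instance, each animation could carry the staged prompts that a researcher otherwise had to supply in person: surface features first, atom identity second, process name last, mirroring the questioning sequence transcribed above. The sketch below is hypothetical Python, not the FreshAiR authoring format:

```python
# Hypothetical scaffolding data: staged prompts attached to each molecular
# animation, ordered from surface features to naming the process.
ANIMATION_PROMPTS = {
    "bacterial_respiration": [
        "What do the different colors in the animation represent?",
        "Which circle is the atom you are tracking?",
        "Molecules enter the cell in big chunks and leave in smaller ones. "
        "What process might this be?",
    ],
}

def run_scaffold(animation):
    """Pause after each prompt and record the pair's answer before the
    animation is allowed to continue."""
    answers = []
    for prompt in ANIMATION_PROMPTS[animation]:
        answers.append((prompt, input(prompt + " ")))  # stand-in for an in-app response field
    return answers
```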
Atom Tracker used an emerging technology, augmented reality, to help students understand complex scientific concepts that are generally unobservable. The design team recognized the promise of using AR to help students "see" and experience processes that have traditionally been taught in very abstract ways. Through AR, students were able to "see" atomic processes taking place in a real-world context. Although there were limitations and failures in the design, Atom Tracker represents a promising experiment in using emerging technology to address a complicated learning goal, and these ideas can be built upon in future work.

ACKNOWLEDGMENTS

EcoMOBILE research was supported by National Science Foundation grant no. 1118530 and by the Qualcomm Wireless Reach Initiative. EcoMUVE research was supported by Institute of Education Sciences grant no. R305A080514. Augmented reality activities were developed using FreshAiR by MoGo Mobile, Inc. Molecular animations and 3D models were produced by Wisdom Tools and Half Full Nelson. Videos of bacterial respiration were produced by Blot Media Limited. Comic strip art used in EcoMUVE Atom Tracker was created by Singer Ko, through PublicVR. All opinions, findings, conclusions, or recommendations expressed here are those of the authors and do not necessarily reflect the views of the Institute of Education Sciences, the National Science Foundation, or Qualcomm, Inc. We are also grateful for support and assistance from M. Shane Tutwiler, Lin Pang, Trisha Vickery, Malik Hussain, Maria Anaya, Roshi Razavi, Tim Johnson, Janet Dykstra, and Mayer Chalom.

REFERENCES

Ainsworth, S. (1999). The functions of multiple representations. Computers & Education, 33(2-3), 131–152. doi:10.1016/S0360-1315(99)00029-9

Anderson, C. W., Sheldon, T. H., & Dubay, J. (1990). The effects of instruction on college nonmajors' conceptions of respiration and photosynthesis. Journal of Research in Science Teaching, 27(8), 761–776.

Chang, H.-Y., Quintana, C., & Krajcik, J. S. (2009). The impact of designing and evaluating molecular animations on how well middle school students understand the particulate nature of matter. Science Education. doi:10.1002/sce.20352

Chiu, M. H., Chou, C. C., & Liu, C. J. (2002). Dynamic processes of conceptual change: Analysis of constructing mental models of chemical equilibrium. Journal of Research in Science Teaching, 39(8), 688–712.

Dunleavy, M., & Dede, C. (2013). Augmented reality teaching and learning. In M. J. Bishop & J. Elen (Eds.), Handbook of research on educational communications and technology (4th ed., Vol. 2). New York: Macmillan.

Grotzer, T. A. (2012). Learning causality in a complex world: Understandings of consequence. Lanham, MD: Rowman & Littlefield.

Harrison, A. G., & Treagust, D. F. (2003). The particulate nature of matter: Challenges in understanding the submicroscopic world. In Chemical education: Towards research-based practice (pp. 189–212). Springer Netherlands.

Kamarainen, A. M., Metcalf, S., Grotzer, T., & Dede, C. (2015). EcoMOBILE: Designing for contextualized STEM learning using mobile technologies and augmented reality. In H. Crompton & J. Traxler (Eds.), Mobile learning and STEM: Case studies in practice. Routledge.

Lin, C. Y., & Hu, R. (2003). Students' understanding of energy flow and matter cycling in the context of the food chain, photosynthesis, and respiration. International Journal of Science Education, 25(12), 1529–1544.
Özmen, H. (2011). Effect of animation enhanced conceptual change texts on 6th grade students' understanding of the particulate nature of matter and transformation during phase changes. Computers & Education, 57(1), 1114–1126. doi:10.1016/j.compedu.2010.12.004

Pallant, A., & Tinker, R. F. (2004). Reasoning with atomic-scale molecular dynamic models. Journal of Science Education and Technology, 13(1), 51–66.

Smith, C., Wiser, M., & Anderson, C. W. (2004). Implications of research on children's learning for assessment: Matter and atomic-molecular theory. 1–79.

Stern, L., & Roseman, J. E. (2004). Can middle-school science textbooks help students learn important ideas? Findings from Project 2061's curriculum evaluation study: Life science. Journal of Research in Science Teaching, 41(6), 538–568.

Taber, K. S., & Watts, M. (1996). The secret life of the chemical bond: Students' anthropomorphic and animistic references to bonding. International Journal of Science Education, 18(5), 557–568. doi:10.1080/0950069960180505

Talanquer, V. (2009). On cognitive constraints and learning progressions: The case of "structure of matter." International Journal of Science Education, 31(15), 2123–2136. doi:10.1080/09500690802578025

Talanquer, V. (2011). Macro, submicro, and symbolic: The many faces of the chemistry "triplet." International Journal of Science Education, 33(2), 179–195. doi:10.1080/09500690903386435

Valanides, N. (2000). Primary student teachers' understanding of the particulate nature of matter and its transformations during dissolving. Chemistry Education Research and Practice, 1(2), 249. doi:10.1039/a9rp90026h

Wilensky, U., & Resnick, M. (1999). Thinking in levels: A dynamic systems approach to making sense of the world. Journal of Science Education and Technology, 8(1), 3–19.

Wilson, C. D., Anderson, C. W., Heidemann, M., Merrill, J. E., Merritt, B. W., Richmond, G., … Parker, J. M. (2006). Assessing students' ability to trace matter in dynamic systems in cell biology. CBE-Life Sciences Education, 5, 323–331. doi:10.1187/cbe.06

Wu, H. K., & Shah, P. (2004). Exploring visuospatial thinking in chemistry learning. Science Education, 88(3), 465–492. doi:10.1002/sce.10126

Xie, Q., & Tinker, R. (2006). Molecular dynamics simulations of chemical reactions for use in education. Journal of Chemical Education, 83(1), 77.