ai in education /research/ai-institute/ en Considering Learning and Evidence of Impact in Evaluating the Potential of AI for Education /research/ai-institute/2024/10/29/considering-learning-and-evidence-impact-evaluating-potential-ai-education <span>Considering Learning and Evidence of Impact in Evaluating the Potential of AI for Education</span> <span><span>Amy Corbitt</span></span> <span><time datetime="2024-10-29T10:10:09-06:00" title="Tuesday, October 29, 2024 - 10:10">Tue, 10/29/2024 - 10:10</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/research/ai-institute/sites/default/files/styles/focal_image_wide/public/people/bill_penuel_headshot_600_0.png?h=83614ab5&amp;itok=GBUpRdT4" width="1200" height="600" alt> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/research/ai-institute/taxonomy/term/189"> Blog </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/research/ai-institute/taxonomy/term/217" hreflang="en">School Administrators</a> <a href="/research/ai-institute/taxonomy/term/218" hreflang="en">Teachers</a> <a href="/research/ai-institute/taxonomy/term/213" hreflang="en">ai in education</a> </div> <a href="/research/ai-institute/william-penuel">William Penuel</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> <div><p><em><span>William R. Penuel is a professor of learning sciences and human development in the School of Education at the University of Colorado Boulder. 
His current research examines conditions needed to implement rigorous, responsive, and equitable teaching practices in STEM education. At iSAT, he is a Co-Principal Investigator and Co-Lead of Strand 3, which focuses on inclusive co-design processes to empower stakeholders with diverse identities to envision, co-create, critique, and apply AI learning technologies for their schools and communities.</span></em></p><p dir="ltr"><span>As school and district leaders, you are used to building planes while flying them. But the advent of AI—specifically Generative AI—in classrooms has caught many of us off guard, leaving us unsure of what airspace we’ve entered. Generative AI is the technology behind popular tools like ChatGPT, as well as tools today that use AI to help teachers build lesson plans and assessments for use in their classrooms. It’s a specific kind of AI that learns from the data it’s been fed (such as text, video, or images) to create new content. If you’ve tried it out, you may be struck both by its capacity to simulate human interaction and by its limitations.</span></p><p dir="ltr"><span>Generative AI presents many interrelated challenges to you as an education leader, as well as to teachers, parents, and students, pertaining to safety, transparency, and ethics. In this blog post, we want to focus on two other central issues that Chief Academic Officers, district technology leaders, principals, and instructional coaches should keep in the foreground when evaluating the potential integration of AI into schools:&nbsp;</span><em><span>learning and&nbsp;evidence of impact</span></em><span>. Learning has to do with both our goals for learning and how we support them.&nbsp;</span><em><span>Evidence of impact</span></em><span> has to do with the power and limits of tools to achieve those learning goals. Good evidence also covers what’s required of teachers to implement tools well enough to achieve benefits for students. 
Both of these considerations are important in evaluating Generative AI and other tools, but they often live in the background of discussions about Generative AI.</span></p><p><span>Take the discussion of the potential of Generative AI for personalization and differentiation of learning. This is chief among the advantages that advocates of AI tout. The questions to consider are:&nbsp;</span><em><span>What kinds of learning goals can Generative AI support?&nbsp;What do we know about the potential of Generative AI for supporting these goals?</span></em></p><h4><span>Intelligent Tutors Help Personalize Individuals’ Mastery of Discrete Knowledge and Skills</span></h4><p dir="ltr"><span>There is more than 50 years of research on intelligent tutoring systems (ITSs) that we can draw on to give us a sense of what learning goals AI for personalization can support. ITSs are trained when their developers subdivide the knowledge to be taught into smaller components—skills, abilities, and concepts—allowing ITSs to recommend tasks based on a student’s mastery level. There’s a large body of&nbsp;</span><em><span>evidence of impact</span></em><span> suggesting that, for the kinds of problems ITSs are used to help students with, they do at least as well as human tutors in supporting learning.</span></p><p><span>However, while AI excels at guiding students toward specific, well-defined learning goals (like solving a math problem), it struggles with more open-ended tasks where multiple solutions exist, or where collaboration and dialogue are essential. Further, it may limit deeper engagement and valuable experiences like productive struggle or peer collaboration. The evidence base, moreover, applies only to well-designed ITSs. Many of today’s Generative AI tools can’t achieve the results of the best ITSs. 
While they are good at handling requests in everyday language, many of these tools still give&nbsp;</span><a href="https://www.nytimes.com/2024/07/23/technology/ai-chatbots-chatgpt-math.html" rel="nofollow"><span>inaccurate answers to math problems</span></a><span> students encounter in schools.</span></p><p><span>This is not to say that Generative AI won’t become more capable of solving math problems or supporting critical thinking, teamwork, and real-world problem solving in the future, but there is not yet strong&nbsp;</span><em><span>evidence of impact</span></em><span>&nbsp;for achieving these learning goals. There is even less evidence about what’s needed to prepare teachers to use these tools well. There’s reason to be skeptical, then, about claims that the current class of Generative AI tools can support these goals.&nbsp;</span></p><h4><span>AI Can Support Collaborative Problem Solving in Inquiry-Rich Environments</span></h4><p><span>There’s an equally rich body of&nbsp;</span><em><span>evidence of impact </span></em><span>for a set of AI tools that support collaborative learning. For more than two decades, the field of computer-supported collaborative learning has created and tested different tools focused on fostering group awareness and giving students feedback on small groups’ cognitive and social dynamics. A&nbsp;</span><a href="https://journals.sagepub.com/doi/full/10.3102/0034654318791584" rel="nofollow"><span>review</span></a><span> of these kinds of group awareness tools shows improvements to students’ knowledge and skill, as well as to group task performance and social interaction in collaborative learning. The relevance of these findings for K-12 schools, though, is not as clear, because many of these tools were designed for online environments in higher education.&nbsp;</span></p><p><span>Here’s where emerging research comes in – the kind designed to build evidence of impact grounded in a robust vision for teaching and learning. 
The Institute of Student AI-Teaming is developing&nbsp;</span><a href="/research/ai-institute/our-products/ai-partners-and-tools" rel="nofollow"><span>AI partners</span></a><span>—the Community Builder (CoBi) and the Jigsaw Interactive Agent (JIA)—that perform the key functions of group awareness tools. These tools are intended to be integrated with rich&nbsp;</span><a href="/research/ai-institute/our-products/curriculum-units" rel="nofollow"><span>curricula</span></a><span> that focus on collaborative problem solving in STEM. They do something very different from Generative AI tools as those are currently used to plan instruction or support personalization: they help students learn to collaborate more effectively and equitably. They support a different kind of&nbsp;</span><em><span>learning</span></em><span>, too, one that is focused on students figuring out ideas and solving problems together, using disciplinary practices from STEM that are targeted in today’s standards. And while we are still gathering&nbsp;</span><em><span>evidence of impact</span></em><span>, we already know that students use some collaborative problem-solving skills more when an AI partner supports their learning. We aim to make these partners—and the instructional materials to teach about AI—available to schools for free in the coming year.</span></p><h4><span>Questions to Ask About Learning and Impact</span></h4><p><span>AI is here to stay, and as a leader, you have an obligation to approach AI use responsibly and ethically in service of your vision for teaching and learning. No doubt, AI may now or in the future be useful for increasing efficiency in how teachers plan and how students develop discrete knowledge and skill. 
As vendors continue to rush to offer generative AI products to schools and districts, it’s important to ask three questions:</span></p><p dir="ltr"><em><span>What kind of learning does this tool support?</span></em></p><p dir="ltr"><em><span>What kind of preparation do teachers need to use the tool well?</span></em></p><p dir="ltr"><em><span>What evidence of impact is there for the claims being made about Generative AI?</span></em></p><p dir="ltr"><span>Integrating AI into classrooms is likely to change how teachers teach and how students learn. Teachers will need support in learning how the AI works and how to use AI tools in ways consistent with what we know about how students learn. A generative AI chatbot doesn’t understand how people learn, no matter how skillful its interactions seem. That leaves you, as a critical consumer of AI tools, with the responsibility to ask vendors tough questions about their ideas about teaching and learning and to demand evidence for bold claims about the power of AI.</span></p><p dir="ltr"><span>Now is a moment when we are all particularly open and keen to learn about AI, and it is as imperative as ever to create opportunities where educators and leaders can learn together about the potential and limits of Generative AI and other tools that support learning goals for collaborative problem solving. We not only have to be “in the loop”: as decision makers about teaching and learning, we need to stay at the center, working at a pace that protects our children, takes care of our visions for teaching and learning, and follows evidence more than hype.</span></p></div> </div> </div> </div> </div> <div>As school and district leaders, you are used to building planes while flying them. But the advent of AI—specifically Generative AI—in classrooms has caught many of us off guard, leaving us unsure of what airspace we’ve entered. 
Generative AI is the technology behind popular tools like ChatGPT, as well as tools today that use AI to help teachers build lesson plans and assessments for use in their classrooms. </div> Tue, 29 Oct 2024 16:10:09 +0000 Amy Corbitt 841 at /research/ai-institute Where Does the Data Go? A Behind-the-Scenes Look at iSAT’s Security Measures for Classroom Data Collection and Handling /research/ai-institute/2024/10/17/where-does-data-go-behind-scenes-look-isats-security-measures-classroom-data-collection <span>Where Does the Data Go? A Behind-the-Scenes Look at iSAT’s Security Measures for Classroom Data Collection and Handling</span> <span><span>Amy Corbitt</span></span> <span><time datetime="2024-10-17T19:24:29-06:00" title="Thursday, October 17, 2024 - 19:24">Thu, 10/17/2024 - 19:24</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/research/ai-institute/sites/default/files/styles/focal_image_wide/public/2024-10/Screenshot%202024-10-17%20at%203.28.47%E2%80%AFPM.png?h=a888e872&amp;itok=mYMfILiq" width="1200" height="600" alt="Data Blog Screenshot "> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid fa-folder-open"></i> </div> <a href="/research/ai-institute/taxonomy/term/189"> Blog </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/research/ai-institute/taxonomy/term/213" hreflang="en">ai in education</a> <a href="/research/ai-institute/taxonomy/term/211" hreflang="en">data 
collection</a> <a href="/research/ai-institute/taxonomy/term/212" hreflang="en">secure data</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-text" itemprop="articleBody"> <div><p dir="ltr"><span>By Charis Clevenger</span></p><p dir="ltr"><em><span>With a Master's in Family and Human Development,&nbsp;</span></em><a href="/research/ai-institute/charis-harty" rel="nofollow"><em><span>Charis’s</span></em></a><em><span> personal research interests include AI in education, relationship building, learning through collaboration, equity in public schools, and viewing learning through the biopsychosocial model.</span></em></p><p dir="ltr"><span>Do you ever wonder what happens to student data once the microphones and cameras are out of the classroom?&nbsp;With AI in education, there can be a lot of questions and concerns about how CU Boulder is protecting students’ information, whether it be their name, voice, image, or even the work they submit in class. It is challenging enough to navigate the school-age years – worrying about whether data remains secure shouldn’t be one of those challenges.</span></p><p dir="ltr"><span>My name is Charis Clevenger, and I am the data manager for the Institute of Cognitive Science and iSAT. As a mother and former educator, I see the protection of vulnerable populations, including our children, as a critical motivating force in my role as data manager. 
Having been with iSAT since its founding (we are now in year 5), I make it a priority to ensure that we keep up to date with the latest best practices and safest measures for securing the data we collect.</span></p><p dir="ltr"><span>iSAT, as a whole, is committed to following the&nbsp;</span><a href="https://www.sciencedirect.com/science/article/pii/S0048733313000930" rel="nofollow"><span>Responsible Innovation Framework proposed by Stilgoe and colleagues (2013)</span></a><span>, under which we protect the future from harm by emphasizing stewardship of science and innovation in the present. Below are some of the ways we apply this framework in our research policies on collecting data in classrooms.&nbsp;</span></p><p dir="ltr"><span><strong>Anonymizing personally identifying information at every stage</strong></span></p><p dir="ltr"><span>The first step after we collect data is removing any information that can identify a student participant. For this, we use study IDs instead of students’ real names. We also anonymize any information about their context, whether it’s who their teacher is, which school they attend, or which district they are in. Additional measures we take are:</span></p><ol><li dir="ltr"><span>Using untraceable identification numbers,</span></li><li dir="ltr"><span>Blurring videos used for general analysis,</span></li><li dir="ltr"><span>Transcribing speech to minimize the need for additional video use.</span></li></ol><p dir="ltr"><span><strong>Ensuring raw data is secure once collected</strong></span></p><p dir="ltr"><span>Data is kept on secure, password-protected servers. Data collectors follow rigorous cybersecurity protocols and safeguards, such as never “staying logged in” to any data networks.</span></p><p dir="ltr"><span>Additionally, iSAT carefully curates datasets based on the specific needs of our in-house expert research teams. 
This happens only after the collected data has been rigorously checked and rechecked for anything that could reveal identifying information. For example, if a school announcement made over the intercom during data collection contains identifying information about the school and ends up audible on a recording, we remove it. In this way, our team ensures that collected data passes several levels of inspection and cleaning, and moves through various access-control channels, before it is ever forwarded to research teams. We also track which data is being used and by whom. This minimizes access to data that is not necessary for any given team’s research.</span></p><p dir="ltr"><span>In summary, it is imperative to update and refine the security measures that protect the privacy of student participants. That is why iSAT has created a system that runs all collected data through various pre-processing and cleaning stages, limits access to data for research purposes only, and securely stores data for the lifetime of its use.&nbsp;</span></p></div> </div> </div> </div> </div> <div>Do you ever wonder what happens to student data once the microphones and cameras are out of the classroom?&nbsp;With AI in education, there can be a lot of questions and concerns about how CU Boulder is protecting students’ information,</div> Fri, 18 Oct 2024 01:24:29 +0000 Amy Corbitt 836 at /research/ai-institute