0:00:03.52 | 28.2s | Ed Clark | We can't change something we don't participate in. We all need to participate. We need people to criticize it, write about it. We need to then get our stakeholders to go to OpenAI and say, we saw this happen when we asked for image generation, and it spit out these kinds of things. We need to have that conversation. But if we don't participate, we're not having that conversation, and it extends the bias and makes things worse and worse over time.
0:00:47.11 | 73.2s | Ingrid Nuttall | Hi, everybody. Welcome to another episode of Heard. Artificial intelligence, or AI, has finally hit the Heard pod waves. We were joined by returning AACRAO pod guest Ed Clark, Chief Information Officer at the California State University system, or CSU, to talk about their significant investment in a far-reaching AI strategy and how he believes it will support the nation's largest four-year public, highly diverse university system. We talked about what AI is good for and what it's not, the challenges it presents for diversity, equity, and inclusion, and also how it can empower new research and ideas across a variety of disciplines. A key takeaway from this conversation: you have to be part of what is going on if you want to understand it, and if you don't understand it, you really can't help make it better. A quick plug before we begin: I strongly recommend you also listen to Ed's interview on For the Record with Kemal Badur, Chief Technology Officer at the University of Chicago. It's season 6, episode 4. Ed mentions Kemal a couple of times in this conversation, and I promise you, you will enjoy that episode with the two of them. All right, let's get started.
0:02:05.16 | 3.6s | Ingrid Nuttall | Welcome to another episode of Heard. I'm Ingrid Nuttall. |
0:02:09.8 | 1.1s | Portia LaMarr | I'm Portia LaMarr, |
0:02:10.55 | 1.6s | Tashana Curtis | and I'm Tashana Curtis. |
0:02:12.24 | 13.5s | Ingrid Nuttall | And joining us on the pod today is Ed Clark, CIO for the California State University system, to talk about all things artificial intelligence and what's been happening in California. Ed, welcome to Heard.
0:02:25.77 | 1.7s | Tashana Curtis | Welcome, welcome.
0:02:27.72 | 1.6s | Ed Clark | So happy to be here. Thank you. |
0:02:30.0 | 10.1s | Ingrid Nuttall | So we always begin by asking our guests to tell the AACRAO community a bit about themselves and their role and their higher education journey. Ed, can you share a little bit about yourself?
0:02:40.97 | 80.1s | Ed Clark | Sure. As mentioned, I'm the CIO for the California State University system. I've been in my role for about two years now. Prior to that, I was at a number of institutions in Minnesota for about 30 years: I worked at the University of Minnesota, in the Minnesota State university system, and at the University of Saint Thomas. So, lots of higher ed experience, mostly in Minnesota, and now I'm on the California adventure. I have kind of an interesting background. Up until age seven, I lived in Bangkok, so Thai was my first language. Then I came over to the United States; my parents settled in the South, so I have a Southern accent, because that's how I learned English. When I went to Florida State as an undergrad, I was a Pell Grant recipient, out of state, and I worked full time, so I faced a lot of the challenges that I now see many of our students have in the California State University. So I have a soft spot for international students and students with similar challenges, and of course our mission is to help all folks be successful, so I'm really excited to be part of that.
0:04:01.86 | 34.2s | Ingrid Nuttall | So we reached out to you when we saw, I think on LinkedIn or in the news, that the California system had rolled out a significant AI investment called AI-Empowered CSU, and we have wanted to discuss artificial intelligence within the context of diversity, equity, and inclusion on this podcast. It's come up in some of our episodes, even just in some of our conversations with each other, so we're really excited. Can we start with just an overview of AI-Empowered CSU? What are the broad strokes, and what are some of the key details?
0:04:36.48 | 109.2s | Ed Clark | Yes, I've been asked to talk about this in lots of places now. With the AI-Empowered CSU, our Chancellor wanted us to create an AI strategy that would support the mission of the CSU, and to do that, we had to put our students first. As AI becomes the next technology that transforms everything we do (and I can weigh in on that later), we want to make sure our students aren't left out. We want every student, regardless of major, not just the STEM folks: if you're an English major like I was, or a music major, you're going to have access to these tools and learn how to use them in your context, so that when you graduate, whether you go into the workforce or grad school, you're going to know about these things and be able to leverage them the way people will be leveraging them in the future. The second part is that to get there, we're going to have to bring our faculty along. We have to empower them with the same tools, empower them in their teaching and research, and give them all sorts of support, the time and space to incorporate these kinds of things into their programs, their courses, and so forth. And then finally, for our staff: as the CSU, our mission is to serve the state of California, our constituents and our communities, and we need to become even better as a system. There are some big holes we know we have to solve in terms of our use of data and how we can promote better outcomes for our students, and we're hoping these tools can help us achieve those goals as well.
0:06:26.36 | 33.5s | Ingrid Nuttall | Ed, you just said that AI can be leveraged to provide better outcomes for our students. Can you dive into that a little bit? I'm unpacking it, and in the back of my mind I'm thinking about the irony, or how that butts up against the idea of AI and LLMs taking people out of some kind of process. So how can technology produce better outcomes for people?
0:07:00.54 | 112.0s | Ed Clark | Well, unless you want me to, I'm not going to dive into the deep technology things, but here's one of the biggest challenges we had. We had a big initiative we called Graduation Initiative 2025, and we wanted to have the best outcomes of any system in the country for students with the kinds of challenges our students have; we talked about that earlier. By and large, we had some measurable success, but we didn't reach our goals. So the question became, well, why didn't we reach our goals? And it turns out, of course, that every university has its own data sources and its own systems. Think about the Cal State University: 23 universities plus the Chancellor's Office, and people are leveraging different technologies, different systems, different data sets. It's a jumbled mess. I call it the technology jungle: we have these great goals on top of a mountain, and we've built a jungle in front of that mountain, and it's hard to cut through it to get to the goal. The old way, we would have had to get all our tech teams to agree: we're all going to use this tool, we're all going to do things this way with data. That's a really tough ask for a system our size, the largest four-year university system in the country. Well, one of the promises of AI is that you can kind of leave the data where it is and surface the insights that in the past would have required a lot of staff time, a lot of development, a lot of other things. We're hoping these tools can be leveraged to bring those insights together in a meaningful way without all of that impossible work, the decade it would take to untangle the digital jungle.
0:08:53.21 | 9.1s | Ingrid Nuttall | Can you give a specific example, a specific use case within higher education, of that untangling of the jungle, hacking through it?
0:09:03.14 | 103.9s | Ed Clark | Yes. And I want to share, and I don't know if we're going to go down this path or not, that AI is one of the fastest-moving, fastest-evolving technologies we've ever seen. When I talk to CIOs around the country, including my friend Kemal, we're all saying we've never seen anything evolve or change as fast as AI. The internet took seven years to get to 100 million users; generative AI took a very short time, like two years max. So it's way, way fast, and for a lot of the questions you're asking me, I'll give you an answer today, and it might be a different answer in six months. I just want to tell the listeners that. But: we have built what we call a data lake. I'm sure many of your institutions have, too. We take all of this data, structured and unstructured, from all the sources, and we put it in a data lake. And many of our CIOs, many of our community members, thought it was almost unusable. They started saying, Ed, what you built is a data swamp. All the data is there, but we can't get insights out of it, because it takes all this work. And you all are in the registrar's office; you know what I mean by that. It's just this jungle of stuff: you have to normalize it, clean it, do the ETL, to get insights out of it. Well, one of the promises of AI is that you can leave it just like that, put your tools on top of it, and it will generate the insights that would have taken all of that extra work to surface. Is that a good example?
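For readers who want to picture the pattern Ed is describing, here is a minimal sketch, in Python, of "leave the data where it is and put the tools on top." The `ask_llm` function is a hypothetical stand-in for whatever model endpoint an institution actually wires up, and the file layout and question are illustrative, not the CSU's real data lake.

```python
# Sketch: answering an ad-hoc question over raw, un-normalized files
# ("leave the data where it is") instead of building an ETL pipeline first.
# `ask_llm` is a hypothetical placeholder for any chat-completion endpoint;
# the directory layout and question are illustrative assumptions.

from pathlib import Path

def ask_llm(prompt: str) -> str:
    """Placeholder for a real model call (e.g., an OpenAI-style API)."""
    raise NotImplementedError("wire up your institution's model endpoint here")

def answer_over_data_lake(lake_root: str, question: str) -> str:
    # Gather raw snippets from heterogeneous sources without normalizing them.
    snippets = []
    for path in Path(lake_root).rglob("*"):
        if path.is_file() and path.suffix in {".csv", ".json", ".txt"}:
            snippets.append(f"--- {path.name} ---\n" + path.read_text(errors="ignore")[:2000])
    context = "\n".join(snippets)
    # The model, not an ETL job, reconciles the differing formats and definitions.
    return ask_llm(f"Using these raw records:\n{context}\n\nQuestion: {question}")

# e.g. answer_over_data_lake("datalake/", "How many computer science majors are enrolled?")
```

The design point is the one Ed makes: the normalization work moves from an up-front, decade-long data-cleaning project into the model's reading of the raw records, with a human validating what comes back.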
0:10:47.33 | 26.7s | Portia LaMarr | I think that is. And I'm wondering if it also helps in centralizing the way the data is collected. Everyone is using the same pool and getting the same data, but using it in the manner their area needs to use it, as opposed to the questions of, where did you get this from, how did you obtain it, and, oh, you all were collecting this data and we weren't. That, to me, is very interesting.
0:11:14.48 | 52.8s | Ed Clark | Yeah, I think you're so right. I was trying to help with an NSF grant we were putting in, a National Science Foundation grant for research computing, and one of the questions I was asking my systems office was, how many computer science majors do we have in the CSU? The National Science Foundation wanted to know that as part of the grant. I would have thought this was an easy question to answer, but because every institution had its own way of defining that, the answer was, it's gonna take a while, man. I was surprised. But that's exactly the kind of question where these tools may help: use them to surface an answer that at least passes enough muster that I can put it in the NSF grant proposal.
0:12:08.9 | 71.4s | Ingrid Nuttall | So one of the connections to diversity, equity, and inclusion that I'm thinking of in this conversation is that when you leave the data where it is, what you get out is only as good as what you put in. I just came from a session, right before this, about crafting effective prompts and the different strategies for doing that in AI, and you need to talk to the tool in a certain way to get the best things out of it. So within the context of diversity, equity, and inclusion: we are just people, we are imperfect beings, and we are working with imperfect technology. We have built structures that are not awesome. We have collected data that don't necessarily tell the full story. How does one interact with that kind of powerful tool in a way that is mindful of some of the inherent problems that already exist, related to equity and data and the storytelling we do from that data?
0:13:20.11 | 147.3s | Ed Clark | I think you've hit on it. I am a believer in the humans. Our strategy is called the AI-Empowered CSU because we believe in the humans in the middle, the humans who are going to work alongside these tools. They talk about having these copilots, these agents you work alongside, and we're going to need people with expertise to validate what's coming back. In fact, here's something that's kind of interesting: Gartner has a framework about where the sweet spots are for using AI agents and tools. Picture a two-by-two where one axis is the complexity of the job and the other axis is the experience the person has. The first sweet spot is actually low complexity and low experience. I'm working in a call center and I'm new: the AI bot can say, hey, someone just called with a problem, go through these ten steps, and here's what you could say to make them feel better when they seem frustrated. That's really helpful for someone with low experience who is new at the job: OK, I just followed the script the AI bot gave me, and I'm going to do a good job. Now, if someone's been working at the same call center for ten years, that's just going to slow them down. It's wasting their time to have the AI bot talk to them. The other sweet spot, interestingly enough, is high complexity and high experience. Say I'm in a law firm. What if I have low experience, and the AI says, oh, you should use this case when you go to trial, bring this case in? You've probably heard that story; it's a famous case. Someone used AI, went into court, and the whole case had been made up by the AI tool. A hallucination, very damaging. But if you're a law firm partner and you've been there for ten years, the answers you get out of that AI could be very useful. It saves you a lot of time: yeah, that's something I can use, that's something I can't. And I think you just brought up a great example in student success: we're going to surface these things, and it could save you time, if you have enough experience to sort through the truths and the falsehoods.
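As a toy illustration of the two-by-two Ed walks through: the sweet-spot idea comes from his description of the Gartner framework, while the enum names, thresholds, and decision logic below are illustrative assumptions, not Gartner's published material.

```python
# Toy encoding of the complexity-by-experience two-by-two described above.
# The two "sweet spots" come from the conversation; everything else here
# (names, return strings) is illustrative.

from enum import Enum

class Level(Enum):
    LOW = 0
    HIGH = 1

def ai_assistance_fit(task_complexity: Level, user_experience: Level) -> str:
    # Sweet spot 1: simple task, novice user; scripted guidance helps.
    if task_complexity is Level.LOW and user_experience is Level.LOW:
        return "sweet spot: step-by-step AI guidance (new call-center agent)"
    # Sweet spot 2: complex task, expert user; the expert can vet AI output.
    if task_complexity is Level.HIGH and user_experience is Level.HIGH:
        return "sweet spot: AI drafts, expert validates (law-firm partner)"
    # Simple task, expert user: AI prompts just slow the expert down.
    if task_complexity is Level.LOW and user_experience is Level.HIGH:
        return "poor fit: assistance adds friction for the experienced"
    # Complex task, novice user: confident AI errors go unchecked
    # (the fabricated-court-case story).
    return "risky: novice cannot catch confident AI errors"

print(ai_assistance_fit(Level.HIGH, Level.LOW))  # -> "risky: ..."
```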
0:15:47.52 | 30.6s | Portia LaMarr | It's exactly the idea of the open-book test. You went to class and thought, I don't have to study because it's an open-book test, but you didn't realize how long it was going to take you to get through it, because you didn't have that foundational knowledge. When you do, and you take the open-book test, you can go right to where you need to. It helps assist you, and that's the key word, assist, in answering the question. So it's not replacing humans; it's giving you a little bit more time back, like we like to say.
0:16:18.59 | 2.2s | Ed Clark | Absolutely. I think that's so right. |
0:16:21.29 | 12.8s | Tashana Curtis | In higher ed, how are we going to monitor AI for the students who misuse it, or for anyone who misuses it? How will we monitor that?
0:16:34.61 | 252.4s | Ed Clark | OK, so I have so many thoughts on this. One of the other things I do is that I'm also an adjunct faculty member; I teach information systems at Cal State Fullerton, and I'm one of the folks trying to encourage my students to use the AI tools. But I ask them to cite it: you used it, so why did you use it? What was your prompt? What were you trying to get out of it? They have to cite, yes, this is what ChatGPT told me, or whatever, and then bring in, what was your part in this? Because it shouldn't replace you. It shouldn't be the creator; you are the creator. It shouldn't replace your creative practice and all these other things, but there's a whole session that's all about that. To your point about the cheating, it's interesting. You're on the academic side of the house; I've had to deal with this question in many contexts. I don't know how many of your audience knows this, but around 370 BC, Plato wrote the Phaedrus, a dialogue featuring the character Socrates, and in it there's a concern that the invention of writing is going to replace critical thinking and memory. Because they can see this writing, they don't have to know it anymore; they can just peer at the writing, and it's going to take away the critical thinking, and by the way, they don't have to remember anything anymore because it's all written down. That was the concern. And this happens over and over. It happened with the invention of the printing press. In the '80s, and I'm old enough to remember this, it was: calculators are going to take away your ability to do math. Don't use calculators. You couldn't use one on the ACT, so I wasn't allowed to; by the time my kids took the ACT, they were allowed to use a calculator. Then in the early '90s, it was the internet: oh, they can go on Google, they can use Wikipedia, they're going to cut and paste everything. In fact, I found an article from, I think, 2001, and one of the quotes was something like: it's undeniable that the internet has become the biggest tool for cheating ever invented for K-12 and university students. And all of that is actually somewhat true. The concerns are true, but it's also how we do everything now. That's why we all have these cell phones: it's how we do banking, it's how we interact, it's crucial to how we live in today's society. And I think AI is destined for a similar place. There was a survey I just saw: I think it was in 2008, they asked university students what percentage had engaged in academic dishonesty, something that violated academic integrity principles, and it was 70 percent, seven out of ten students who said yeah. They just did that same survey, I think in this last year, and the number is exactly the same, about 70 percent. I don't think you're going to see more cheating. It'll be a different kind of cheating, but I don't think you'll see more. It's a puzzle that comes up throughout history. It is something we need to worry about, and we need to worry about it because we care about our students and their learning and their outcomes. So I agree with the concerns, and that's why I said that as part of our AI strategy we have to empower our faculty. How do you change your assessments? There are ways. At the end of the year, my students have to present their final project orally in front of their class, and, unfortunately for them, the AI is not talking in their ear yet. They have to demonstrate their knowledge, what they've learned, in a way that is going to be very hard to cheat on. I think we're just going to have to change our teaching and learning practices, just like we did for writing and books and calculators.
0:20:47.68 | 23.4s | Portia LaMarr | What suggestions would you have for institutions that are working with their faculty on this? You've already said it: the speed of AI has been fast, and one characteristic of higher education is that sometimes we don't go that fast. How do you mesh those two together?
0:21:12.2 | 99.2s | Ed Clark | Yeah, as I said at the beginning of this episode, things are changing so quickly that every six months the whole world has changed. Last year we were talking about prompt engineering: this is going to be the new career, right? And now the new models are designed so that you don't even need to do the prompt engineering. So it's an interesting puzzle. The good side, and I want all of your listeners to hear this (I had to say it yesterday at Cal State Fresno): don't feel too worried about being left behind. These things are changing so quickly that you could jump in today and be at the top of the game. That's the good side of things changing so quickly. Jump in, even if you're thinking, I don't know anything about this. If you jumped in today and took some basics, you'd be as good as anybody else out there. The bad side, to your point, is: how do we build a sustainable approach to this, for our students, for our faculty, for our institutions? And the thing that's really most interesting about this is that all of the liberal arts skills, those durable skills that people want (creativity, curiosity, compassion, courage), that ability to say, I'm going to take this on and flex with it and learn as it evolves: these are the kinds of people we need now. If you're really locked in, super structured, you're going to be more challenged at this time.
0:22:52.27 | 48.1s | Ingrid Nuttall | And I saw a quote from you in the CSU's press release on the AI strategy that I wanted to read and have you unpack. You said, quote: as the CSU, we have two imperatives: to equip our students with the skills to leverage these powerful tools, and to transform our own institutional practices through AI to better serve the largest public university system in the nation. So you spoke to some of those outcomes, but in addition to the technical skills, and I think that's where this conversation is going a little bit, I am interested in those human skills, those creative skills, right, that can't be automated by AI and that are still needed, maybe even more so. Is developing those human skills part of this initiative?
0:23:40.68 | 94.5s | Ed Clark | Yeah, to your point, as I just said: those durable skills. There's a website (you're probably more familiar with it than I am) that lists the skills universities, liberal arts colleges especially, have long been saying are how we prepare our students. And these are the skills many technology leaders are saying they need: people who are curious, people who are confident, people who can communicate well (there's a lot of Cs in there, right?), and people who create. One of the biggest opportunities we keep seeing in AI is: hey, there's a tool that was developed to do this; oh, I could just take that and use it for this new purpose. That repurposing of what's coming out is doing really amazing things for different industries and practices, and if it's useful, we could go into some of those repurposing use cases. But really, the core of it is that someone creative saw a tool and said, man, I could apply that in higher ed for the good of students. That's where the win is. It's going to be less about your ability to learn how to code in a very structured manner and more about your ability to leverage tools to get the outcome you know you're chasing.
0:25:16.28 | 60.5s | Ingrid Nuttall | I want to, not even push back on that, but complicate it a little, because the human element is still incredibly subjective and problematic. What makes people comfortable that someone is curious, comfortable that someone is communicating, comfortable with that human element, can be so wracked with bias. And then you put on top of that, or alongside it, technology that is consuming vast amounts of data generated by all of those same humans, right? So I'm interested in that thought: that is true, and that is humanity for the rest of time, probably. Recognizing it, calling it out, and figuring out how to manage it in new and creative ways feels like it needs to be a part of this project. That's the thing I'm interested in.
0:26:18.23 | 8.8s | Ed Clark | You've actually highlighted something. OK, again, early days; I'm not going to have all the answers, right? I want to make that clear to everybody, but we...
0:26:27.6 | 2.0s | Ingrid Nuttall | We brought you here for all of the answers, to all of it!
0:26:29.4 | 3.8s | Portia LaMarr | If you don't have them, then what are we doing here on the...
0:26:32.80 | 3.1s | Ingrid Nuttall | ...podcast?
0:26:36.46 | 119.6s | Ed Clark | But yeah. So, one of the challenges we had: you may have heard that as part of our strategy we built this AI Workforce Acceleration Board, because we wanted to make sure our students were prepared for the future. And what you'll hear, and this always kind of frustrated me: in the past, I'd go talk to someone from Google or Deloitte or whatever company you want to name, and they would say, we want creative people, people who can see opportunities where others don't, people who think outside the box, all this stuff, right? And the frustrating thing for me was that they tended to substitute "you went to an elite college" for having those things. Well, no, that's not true. There are a lot of people who have those things who didn't go to a quote-unquote elite college. So give me a way to measure that, and then I'll make sure my students measure up to it. I think you hit on a huge challenge in higher ed, which is that we need a way to document that, to certify it, and say: yep, you have these durable skills. Then, if Deloitte wants someone with those skills, they can go to any college that has turned out that kind of person, and they will hire that person. In practice, these firms would say, oh, you went to Stanford, that means you have all these things, right? And that's a challenge for access institutions like the California State University. That's one of the things we're hoping to address with this Workforce Acceleration Board. I have Microsoft in there, I have Google in there, I have OpenAI, and I've said: OK, we want our students to get these internships and jobs with you, so let's figure out how to measure that and get it out there. I don't have the answer, but I think that is one of the goals.
0:28:36.76 | 49.1s | Ingrid Nuttall | It's sort of like how AI works, in its own way. That mental shortcut for those durable skills: I don't know how to measure them, so, to borrow your sweet spots, it's highly complex and I have low experience, so I'm going to use a heuristic of things I do understand as stand-ins. The same way we sometimes use metrics like Pell-eligible as stand-ins for first-generation; we do this all the time. So it's connected: the way AI functions is just the way people function, too, right?
0:29:26.31 | 18.4s | Ed Clark | Yeah, that's exactly right. And we have to build that future together, absolutely. That is one of the big challenges, and I think it's a part where higher ed has to act: I don't think other people are going to figure that out for us. We have to work together to figure it out.
0:29:45.26 | 8.3s | Ingrid Nuttall | The CSU is the most diverse public four-year university in the country. Did I say that right? Is that real? |
0:29:54.61 | 32.7s | Ed Clark | That is in our fact book, and it can be measured in lots of ways, I'm certain, but we are very proud of it, for sure. I think 21, and now maybe a 22nd coming on, of our universities are Hispanic-Serving Institutions, and we have all sorts of diversity: domestic, international, racial, gender. We're very proud of those things, and that is how we define ourselves.
0:30:27.85 | 20.4s | Ingrid Nuttall | I think of it as a system, right? To your point, it's not just one thing; it's a system. That's why I was almost questioning it: it's not just one institution. So how does AI-Empowered CSU directly support the diversity of that system?
0:30:48.73 | 93.9s | Ed Clark | Well, one of the things I have to remind people is that the Cal State University was not even close to being the first to roll out AI tools to its campuses. Harvard had already done it a long time ago. Michigan. Even Kemal at Chicago rolled out pretty much every tool that's out there and gave it to his campus. The issue is that that just creates a digital divide, right? All the elite schools with all the money have these tools. We need our access universities to have that same kind of access. So when we say we consider ourselves the most diverse university system in the country, and we're going to give those tools to every single student, regardless of major, and make sure they understand how to use them in their context, that is how we're ensuring the sustainability of our university, of our programs, of our curriculum. We're supporting not only our students in the moment but the outcomes for the state of California and for the whole country. One of the most mind-blowing stats I've learned about the Cal State University: one out of every 20 people with a bachelor's degree in the United States got it from the California State University. That's a lot of impact. So by making sure that all of these graduates have access to these tools and know how to use them, we are impacting the entire nation, and really the world, too.
0:32:23.31 | 29.2s | Portia LaMarr | So, naming all those schools: at the end of the day, we all know universities are businesses, and they compete against each other. What do you say to the school that is wanting to get out there, that has that haste to get out there to be competitive, but also needs to understand what it's delivering? What would you say to those institutions? I feel like I've seen that at some.
0:32:53.11 | 26.3s | Ed Clark | I see, I see. You mean strategically, the pace of change. I'll say this; this is another part of what I've been telling folks recently. In my opinion, what we're seeing is almost like the early web. You remember these huge names, like Netscape and Lycos and America Online,
0:33:19.45 | 3.2s | Portia LaMarr | Right? Getting my CD from the grocery store, yes!
0:33:22.95 | 62.9s | Ed Clark | and those names aren't around anymore, right? This churn is exactly the same today; it's just churning a lot faster. And what that churn does is make everybody feel like, I fell behind, look at all this change. So, to your point, some schools say, we've got to jump in today and have a thing and make sure we're in the game somehow, and I think there's a lot of urgency around that. But I'll cite Gartner here. Gartner says there are two races going on. One is the vendor race, and when you feel behind like that, you're actually looking at the vendor race, and you can't participate in that race: you're not a vendor, and they're going to go put themselves out of business. The race you should really focus on is: at my institution, how can I deliver AI outcomes safely, effectively, and strategically to achieve the mission? That's what the schools you're talking about should really focus on. What is our mission, and how can we leverage these tools to achieve it?
0:34:27.28 | 100.1s | Ingrid Nuttall | I had to pause to write that down, because at my institution it is incredibly front and center. Our academic programming has been delivered in a democratized fashion to all staff: you can get this post-bacc certificate, you can use your benefits this way, you can get these badges. There is an interest in democratizing, for lack of a better word, access to the knowledge and the ideas. And in some of the things I've participated in, I feel like it is presented with a healthy amount of what you're talking about: the concept of AI as an idea has been around for a long time, since the '70s, but the delivery of it is more recent and has gone so fast that it's made up for lost time. When you don't have that kind of history of fits and starts, all the little things that made the story of the internet, for example, you are creating history really rapidly right now. So being mindfully, safely, effectively, strategically positioned to deliver is a nice grounding thing, because there is history in that. Our institutions have history; our students have history. We know what's succeeded and failed, so we have something to hold on to that's not just the technology and how it's developing.
0:36:07.59 | 1.7s | Ed Clark | Right, right, absolutely. I like that.
0:36:09.27 | 17.6s | Ingrid Nuttall | So, you recently presented this strategy at a lunch and learn at the Capitol in Washington, DC. Ed Clark, that is very fancy. Can you talk to us a bit about that experience and maybe some of the outcomes you are hoping for from it?
0:36:27.60 | 176.6s | Ed Clark | Well, the experience was very cool. I stayed at the Hilton Capitol Hill, and they picked us up in a big black SUV and drove us over (we were real fancy, yeah) to one of the big halls right there on Capitol Hill; it's called the Rayburn House, I believe. I'd have to look it up; to be frank, I wasn't paying that much attention. My colleague Leslie Kennedy, the associate vice chancellor for academic technologies at the system office, and I gave the presentation we've actually been giving across the CSU and to California legislators, and so they said, hey, you're going to go do this. We had a great conversation with, I think, six Congress people, plus their staff; it was a mix of who was attending. They asked a lot of questions, because AI is top of mind for pretty much everybody right now, whether you're in industry, government, or academics. The idea we were addressing was access and equity for all students, instead of just a select number of students. We're a different kind of institution, so we had to do a lot of negotiating with OpenAI: hey, we can't afford the prices you're charging every other school, so help us address this access to tools. And then there's the idea that we're actually bringing government, these industry partners, and our academics (faculty senate members, presidents, provosts) into the room to look at our curriculum. We're going to try to listen to them, and they're going to listen to us, and we're going to try to change the way we prepare our students for the future. It's a really amazing journey to start, and we just kicked it off; we had our first meeting last month. Everybody asks, well, was it good? And yeah, it was really good. Everybody was excited to be there. There was a lot of excitement, but we just started, right? We have high hopes, but we're going to have to stay on top of it and generate the results we want to generate. So it's early days. People are excited that we're doing the right thing. Now we have to prove something, achieve something; that's where we are in this journey. But given that everything changes every six months, it's an appropriate place to be. Everybody can jump in right now, like I said, and all of a sudden you're at the top of the game.
0:39:24.85 | 18.8s | Ingrid Nuttall | How are you hoping to be communicative, transparent, and accountable through that journey? Like you said, it's a lot of wind-up before the pitch. How are you going to (I can't extend that baseball metaphor, but you know what I'm saying) keep yourself accountable?
0:39:44.13 | 0.8s | Ed Clark | I like it. Are we...
0:39:44.92 | 1.0s | Portia LaMarr | ...at the stadium?
0:39:47.5 | 5.8s | Ingrid Nuttall | At the stadium. Someone once told me that's how I wrote, a lot of wind-up before the pitch, and I've never forgotten it.
0:39:53.56 | 75.4s | Ed Clark | Nice. Well, I can't complete the metaphor either, but we've got to deliver the pitch, right? To your point, one of the most important questions we've been asked by our board, this board we just established, is: how are we going to measure success? What are the metrics we're going to gather to measure success? Being quite honest, that's what we spent a good part of that last meeting on: what are we going to measure, and what's going to be important to us? I'm actually meeting with our CSU generative AI committee this afternoon, and in that group too I'm going to ask: what do you want to measure for teaching and learning? We're going to capture that. We've retained a firm to work with us to think through: OK, if that's what we're going to measure, how would we do it, and how do we track it? So again, early days, but we're excited that at least we're talking to all our community members and saying: yep, this is what we want to track, this is what we want to measure, and we're going to build those metrics. Then, to your point, we report back. We're hoping in six months, at our next big board retreat, to say: you asked us to develop these metrics; this is what we came up with, and this is the progress we've shown so far.
0:41:10.26 | 26.6s | Ingrid Nuttall | I like that: a community approach to what success looks like, which could include qualitative measures too, I would imagine, in addition to quantitative ones. So, way back in 2022, a Blueprint for an AI Bill of Rights was released by the Office of Science and Technology Policy in the federal government. Were you aware of it? Have you looked at it at all, Ed?
0:41:37.67 | 9.9s | Ed Clark | Yes, I'm aware of it. I haven't read it deeply, but I'm aware of it, and of similar kinds of measures in other states, too.
0:41:48.9 | 53.3s | Ingrid Nuttall | Yes. It's a set of principles; people should check it out. One of the principles is called algorithmic discrimination protections. I pulled a quote from it: you should not face discrimination by algorithms, and systems should be used and designed in an equitable way. We've talked a lot, leading up to this, about designing equitable systems, but I'm interested in your thoughts specifically about how this is balanced against the opportunities presented by AI. Some of it is what you've already talked about, keeping humans at the center, but is there anything else on the algorithm side that you're working with?
0:42:41.95 | 148.2s | Ed Clark | I've learned that California and Texas, and I'm sure other states, have passed measures in their legislatures to address this concept. I'm probably using the wrong word, but it's something along the lines of impactful decisions: when there's a really impactful decision, a human must be at the center. We're not going to let an algorithm kick someone out of school. We're not going to let an algorithm admit someone into a school. We need a human to be part of that decision. When you have an impactful decision, don't automate it. And it gets back to what you said: we're going to keep the humans in the middle, working alongside these tools, and I think that's key. If something is a decision of impact, we should not rely on algorithms to make it, and I would say that's true of everything from home loans on down. Ideally you would say, hey, that's a decision that impacts human beings, so a human needs to be part of making it. That's part of my hope for the future. Then, in terms of discrimination and all those other things: there's a certain population, in Cal State just like everywhere, that says, well, because there's discrimination in there, I'm not going to use these tools; I'm not going to engage. And I had a reporter from a famous newspaper ask, well, how do you respond to the fact that these tools have bias, spread misinformation, and do all these other things? My flippant response in the moment was, well, have you heard of the internet? Because of course it's full of bias and discrimination and misinformation. But we can't change something we don't participate in. We all need to participate. We need people to criticize it, write about it. We need to then get our stakeholders to go to OpenAI and say, we saw this happen when we asked for image generation, and it spit out these kinds of things. We need to have that conversation. But if we don't participate, we're not having that conversation, and that extends the bias and makes things worse and worse over time.
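A minimal sketch of the "human in the middle for impactful decisions" rule Ed describes: the algorithm may recommend, but anything tagged impactful is routed to a person. The decision categories, queue, and field names below are illustrative assumptions, not the wording of any state's actual statute.

```python
# Sketch: an AI system may recommend, but decisions flagged as impactful
# (admission, dismissal, home loans) are never finalized without a human.
# Categories and structure are illustrative, not a real policy implementation.

from dataclasses import dataclass
from typing import Optional

IMPACTFUL = {"admission", "dismissal", "home_loan"}

@dataclass
class Decision:
    category: str
    ai_recommendation: str           # e.g., "approve" or "deny"
    finalized_by: Optional[str] = None

human_review_queue: list[Decision] = []

def decide(decision: Decision) -> Decision:
    if decision.category in IMPACTFUL:
        # Impactful: route to a person; the AI output is advisory only.
        human_review_queue.append(decision)
    else:
        # Low-impact: safe to automate end to end.
        decision.finalized_by = "automated"
    return decision

decide(Decision("admission", "deny"))           # queued for human review
decide(Decision("library_renewal", "approve"))  # finalized automatically
```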
0:45:10.63 | 63.9s | Portia LaMarr | Yeah, that's what I was just going to bring up, because AI has had the Black community side-eyeing it; the images that were coming out were a little different from everyday reality. I think people tend to say, I don't trust it and I'm not going to deal with it. But like you're saying, it's OK not to trust it: share why you don't trust it, and speak on it. That way it can get better, because at the end of the day it's people-centered; it's people behind it, creating the code, creating all of it, to generate the outcome. I think that is a big thing to say. And my other thought (I might get to it later) is how all of this changes things for smaller schools. We've named some big, big schools, but those smaller schools that may not have the opportunity to be introduced to AI: I wonder where that leaves them in higher ed.
0:46:15.47 | 44.5s | Ed Clark | Yeah, I think it's a really good point, and I think that when you're part of a smaller school, your voice almost necessarily has to be part of a consortium. Can you join colleagues? Can you build that community, that army of the willing, that community of practice, around whatever you're trying to achieve? Then that consortium can join or present at conferences like AACRAO and AAC&U and all those other things. There is a way to elevate those perspectives, and it's by teaming up. We're lucky as the Cal State University: we're already teamed up, 23 universities. That's what I would say the small schools have to do as well.
0:47:01.37 | 49.7s | Ingrid Nuttall | I love that statement, and I want to restate it: we can't change something that we don't participate in, and we also can't criticize something that we don't participate in, because we don't understand it. You have to engage in understanding to be an agent of change. So, Ed, in closing, and on a high note: what are some of the most exciting things you are looking forward to exploring next? Say AI-Empowered CSU is a success. You have changed the landscape of education. There are consortiums; everybody has what they need. What else? What's the next problem you want to solve?
0:47:51.82 | 49.7s | Ed Clark | I mentioned it earlier. I won't name the firm, but I told them I'm really tired of employers using elite schools as a proxy for what students can do. So, what's next? I would love to work as a consortium, to work with other universities, to figure out how we can start to measure and credential these durable skills, so that all students have that kind of opportunity and it's not just something that's given to the elite. Building that is the hope; that's the dream of higher ed in the end, anyway, and I'm looking forward to working with others to address it.
0:48:42.69 | 21.9s | Ingrid Nuttall | Ed, thank you so much for spending time with us today. It has been a really powerful conversation, actually. I listen to a lot of stuff about AI, and I feel like I've gotten a lot of the same takeaways; this has felt different and, dare I say, hopeful.
0:49:05.40 | 4.6s | Ed Clark | Well, thank you, thank you. It's wonderful to talk to you all. I appreciate the opportunity. |
0:49:18.96 | 15.1s | Ingrid Nuttall | Thanks for listening to another episode of Heard. We'd love to hear from you. Please send us an email at heard@aacrao.org with any feedback you have for us or show ideas. This episode was produced by Doug Mackey. Thanks, Doug. |