The Nonprofit Fix

Shedding Light on OpenAI: Exploring the Power of Nonprofit Hybrid Models

December 09, 2023 Pete York & Ken Berger Season 1 Episode 5

Have you ever wondered if the structure and model of a nonprofit could be reshaped for greater impact? This week on the Nonprofit Fix podcast, we bring you an in-depth investigation into the intriguing world of OpenAI, a pioneering nonprofit organization known for its outlier structure, funding and mission. We unpack its evolution from a traditional nonprofit to a hybrid model, shedding light on the advantages and complexities of such a transition.

Join us as we dissect OpenAI's unique business model, its adherence to IRS requirements and its potential to scale good in unprecedented ways. As we navigate these possibilities, we also delve into the role of program-related and mission-related investments in fueling this innovative model.

Lastly, we pull back the curtain on the role of regulation and enforcement in steering mission and growth in nonprofit organizations. We discuss the potential of converting for-profit businesses with social missions into nonprofits, and the transformational possibilities of hybrid business models. This enlightening journey into the heart of OpenAI and the nonprofit sector promises to challenge your perceptions and inspire new thinking around nonprofit governance and business models. Don't miss out on this engaging conversation!


Peter York:

Welcome to the Nonprofit Fix, a podcast about the nonprofit sector, where we talk openly and honestly about the many challenges that face the sector, where we will discuss current and future solutions to those challenges, and where we explore how the nonprofit sector can have much more positive impact in the world. A podcast where we believe that once we fix the nonprofit sector, we can much more dramatically help to fix our broken world.

Ken Berger:

Hi everybody, Ken Berger here with Peter York for yet another episode. Get ready, because this one is really taking a look at some current, breaking-news kind of stuff, where we think we can possibly learn some lessons that many organizations could consider. So we're going to start off with Peter explaining the specifics of the story we want to talk to you about.

Peter York:

Great, welcome everybody. Thank you, Ken. So what we're going to do today on this episode is talk about the most recent news story around OpenAI. We all know it wasn't that long ago, my goodness, it was probably over the Thanksgiving break, that we had the firing of Sam Altman, the CEO, and then Sam's return to run OpenAI. The reason we thought this would be a very interesting topic is the structure of OpenAI as a nonprofit, and there have been a couple of articles, and Ken and I have been doing our research.

Ken Berger:

There are a couple of good YouTube videos sharing some great information and insights about the way OpenAI is structured. And by the way, just for the general audience that may not know this, Peter, this is the organization that has become famous for creating ChatGPT.

Peter York:

Just to be clear, yeah, amongst many other things. I'll share with you what the mission of OpenAI is. This episode we thought it would be a good idea to share more about the nuts and bolts and guts of the organizational structure, because it's a very interesting organization that began as a nonprofit. I'm going to share a little bit about its history and how it's evolved, and then we're going to talk about the implications, and have an interesting, maybe even back-and-forth, conversation: what is this model? What are the implications of this particular model for how we think about the nonprofit sector? Many of you will recall that I have pointed out in a couple of previous episodes this desire to think about the nonprofit sector and how we can grow it. This OpenAI model is one model that I find intriguing: how could it expand to others? But let's first lay out OpenAI and what it's all about. We found an article on Medium by Brandon Jinn, from November 18th, and we'll put the link in the show notes. Rather than reinventing it, we thought this was the best article summarizing the structure. So, in the tradition of some other podcasts we listen to, I thought let's just read this article from Brandon Jinn and share what he's saying about OpenAI's structure. The title of the article is "How OpenAI's Corporate Structure Evolved Since 2015."

Peter York:

OpenAI's structural evolution reflects a dynamic approach to achieving its mission of developing safe, beneficial artificial general intelligence for humanity. That's its nonprofit mission; that's how it started. Here's how it evolved over the past eight years. The genesis: a nonprofit vision, starting in 2015. OpenAI's journey began in late 2015, when it declared an ambitious objective: to develop AGI, or artificial general intelligence, that is both safe and beneficial for humanity.

Peter York:

The founders, understanding the magnitude of this endeavor, initially set up OpenAI as a nonprofit organization, free from the constraints of profit-driven motives. OpenAI launched with the goal of a billion dollars in donation commitments but faced the reality, and this is part of the key that's central to this conversation, of high capital intensity. That means you had to raise a lot of money, and you had to do so purely as a nonprofit. Over time, the nonprofit received approximately $130.5 million in total donations, funding its initial work in deep learning, safety and alignment. Alignment, in the AI world, really means trying to align the AI with human values. So it was ready to take off, but you can hear there was a capital issue, a need to raise more capital, and there was a transition from its pure nonprofit model into a hybrid model in 2019, four years later. So, despite the noble intent of starting as a nonprofit, it became apparent that reliance on donations alone wouldn't suffice for the escalating costs of computational power.

Ken Berger:

Can we pause there? So I think we really want to highlight this, because one of the critiques that we've heard from a number of experts, and we might even have another podcast related to this coming soon, is that one of the fundamental weaknesses of all nonprofits, at least of a certain kind, is that you basically have to go, if you will, begging for money, and the amount of money you can raise in the traditional nonprofit model as charity really is limited. If you have some desire to truly scale the organization, you're not going to be able to do it with the traditional tools of a nonprofit. So this thing that they're confronting here is a fundamental challenge, and next, I think, where Peter's going is: this is the way they went about trying to solve that fundamental challenge.

Peter York:

Now, along those lines, let me just reiterate a point I made, in response to what Ken just shared: they did raise $130.5 million as a nonprofit, which is amazing. That is amazing, that's mega. And that wasn't enough in a particular area like AGI and AI, because they realized they needed a billion dollars.

Ken Berger:

What I would just say, not to in any way minimize your point but to add a bit of nuance: depending on the kind of nonprofit you are, even the kind of charitable giving you can receive is largely driven by type. So if you are a cutting-edge tech organization like this, you may have access to tech billionaires for charitable donations that a homeless organization down the street doesn't have access to. I'm not saying that isn't an amazing accomplishment; for a new organization, that's amazing. But I do think we need to realize that tech billionaires, or that group, might come to this when they might not normally donate to some of the other organizations we typically think about.

Peter York:

Well, you're speaking to ROI. At the end of the day, we're talking about a return on investment, and this was clearly structured as a nonprofit to keep its mission front and center. But obviously it's a particular sector, and a slice of a type of company, or organization, if you will, where there's a lot of return on investment. As we're seeing, they're still not profitable, from what I understand, but then neither were many of the other big tech companies early on. It's just interesting to see how something in the tech space could be led by a nonprofit. It still is a nonprofit, oh yes. So again, and now I'm reading again, please make sure you all know Brandon Jinn wrote this article, which we just found to be very accessible: despite the noble intent, it became apparent that reliance on donations alone wouldn't suffice.

Peter York:

So, back to that point. Hence, in 2019, OpenAI innovated a unique structure, introducing, and I'm putting this in quotes, a "capped-profit" arm, while retaining its nonprofit core. This dual structure allowed OpenAI to preserve the core mission and governance of the nonprofit while raising capital through the for-profit subsidiary. The nonprofit remained the overarching governing body, so it had control, voting control, over the subsidiary. However, that subsidiary, and this is me saying this, was 51% owned by the nonprofit. So it did allow investors and employees and others to hold investment shares, if you will, in the for-profit subsidiary. The nonprofit entity retains full control over the for-profit subsidiary, ensuring that every decision aligns with the mission of a safe and beneficial AGI.

Peter York:

The hard caps on profits to investors and employees also ensure that the excess value benefits humanity. But we do want to say something along these lines, and this is me talking now: the hard cap was a 100X return. We did a little bit of research, and for some of the biggest unicorn companies in history, 100X, as far as we could find, is unheard of in terms of returns to early investors. So the fact that it's capped at 100X is very important to note. Even some of the biggest unicorn companies did not return 100X to their initial investors.

Ken Berger:

Yeah, if I could add to that. I think what we saw was that for the most profitable for-profit companies we're aware of, 30X is like the upper limit. So to create something with a 100X cap is essentially, as far as I'm concerned, completely bogus. As far as I could see, it's essentially a workaround to comply with the requirements of being a nonprofit. They're technically within the law and the regulations, but in reality it's bogus; it's just signaling compliance with the law, and not really anything more than that.
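To make the numbers above concrete, here is a back-of-the-envelope sketch. This is entirely our own illustration, not OpenAI's actual terms (the real structure involves multiple entities and negotiated, per-tranche caps); the function name and dollar figures are made up:

```python
def capped_payout(investment: float, gross_multiple: float, cap: float = 100.0) -> float:
    """What an investor actually receives under a hard return cap.

    Anything the stake would have earned beyond `cap` times the
    original investment is cut off; under capped-profit terms the
    excess flows to the nonprofit rather than the investor.
    """
    return investment * min(gross_multiple, cap)

# A $1M stake in an outcome that would have grossed 150X is clipped at 100X:
print(capped_payout(1_000_000, 150.0))  # 100000000.0, i.e. $100M
# A 30X outcome, closer to the historic ceiling Ken mentions, is untouched:
print(capped_payout(1_000_000, 30.0))   # 30000000.0, i.e. $30M
```

The `min()` is the whole mechanism: the cap only bites on outcomes larger than anything venture history has actually produced, which is the substance of Ken's "bogus" critique.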

Peter York:

Right, correct. So the last points are these: OpenAI's board, comprising a mix of employees and non-employees, is the governing body that anchors the organization in its mission. This diverse group of leaders with varied expertise ensures that OpenAI stays true to its founding principles while navigating the complexities of developing AGI. That's the end of the article. I just want to add a few other points, so that we give this conversation all the context it needs. When we talk about this type of model, some of you may be thinking: is this even legal? And the answer is yes, if it follows certain IRS rules. The IRS rules for a for-profit subsidiary owned by a nonprofit are designed to prevent fraud and to ensure that the mission is not compromised. There are some key points we want to highlight that are kind of the rules of the road, so to speak. This is what nonprofits really need to follow.

Peter York:

One is the nonprofit-to-for-profit transition: the IRS does not allow nonprofits to simply transform into for-profit entities. That's why OpenAI still maintains its nonprofit status, and the nonprofit still controls the for-profit subsidiary. This is to prevent fraud. You just can't take a nonprofit and convert it into a for-profit, and there are a lot of reasons for that. Otherwise you could start as a nonprofit, raise a lot of capital, then turn for-profit. It becomes a shell game; you're just moving the shells around.

Peter York:

Second, ownership and control. The nonprofit entity can retain ownership and control over the for-profit subsidiary. In the case of OpenAI, the nonprofit entity, OpenAI Inc, retained ownership of the OpenAI trademark and full control over the subsidiary. Control means votes, by the way; it doesn't necessarily mean ownership. Ownership speaks to the percentage of the equity, if you will, the shares, the actual investment capital. Control speaks to who makes decisions. That's an important distinction. Third, we also have to note another rule, and that is the unrelated business income rule. The for-profit subsidiary of a nonprofit can generate what the IRS calls unrelated business income; it's just that it's taxable. This would be anything that doesn't fit the mission. By the way, when you think about OpenAI, all the stuff it's doing so far can be claimed under the mission, so it is not unrelated business income.

Ken Berger:

So, from what I've heard about this, there are two big pillars of what I would call workarounds, or ways to comply with the requirements of the IRS. One was the whole issue of the capped profit, and the other is making absolutely clear that there's total alignment between the mission of the nonprofit and the for-profit. Those were the two overarching core things that helped them get past private inurement, unrelated business income, and the related fiscal-responsibility barriers they might have confronted. All of that was handled by saying, well, it's the same mission, the exact same noble mission as the nonprofit.

Peter York:

Yeah, and I'm not sure if you're intending skepticism in your tone, but to me, when I see what they're doing, and by the way, they are doing a lot of R&D, and its mission, remember, is to strive to actually bring AGI safely, and all of its benefits, to society, they've cleared the hurdle, at least from the IRS standpoint, I'm sure. At that scale, by the way, with billions of dollars now going into OpenAI, they've got lots of lawyers making sure they're not breaking any rules from the IRS standpoint. So the next thing to mention is-

Ken Berger:

Well, I guess just to respond to your question about my tone: I think that you can have lawyers that get you to comply with the technical requirements of the law, and to me that is not the same as complying with the law with integrity. That whole 100-times capped profit, to me, doesn't pass that second hurdle. As to the mission part, listen, it seems like there's a lot of very exciting stuff going on, and no one is perfect, but at least on that score, for me, it doesn't really pass that second hurdle in terms of being really meaningful.

Peter York:

Well, that's actually getting to a separate point, so let me get there, because the UBI piece is the point I'm raising that we could debate. It's debatable whether they pass the UBI test, the unrelated business income test, with all of their subscription services and all that. You can get into interesting debates there; the fact is, it's debatable. But it does seem in line with their mission, and it seems they are passing the UBI test, at least from the IRS standpoint. We can debate whether that's true from an ethical standpoint; there's ethical and there's legal, so we could talk about what that means. But let me keep going. I want to ask you-

Ken Berger:

But let's just play it out another way. Let's suppose I form a nonprofit called, not Good Teeth for Everyone, just Goodness for Everyone, and the mission is to help humanity get better. That's the mission. You could drive a truck through that mission, and you could do virtually anything and create a for-profit underneath it. If my for-profit is a toothpaste company, or my for-profit is whatever, fill in the blank, you could make an argument that the mission of that for-profit aligns with the nonprofit mission, because it's so broad you could drive a truck through it. Again, I think one of the problems is, and I've told you before about these crazy nonprofits that do all kinds of things, that the IRS is very reluctant to put any kind of boundaries in place. The boundaries of nonprofits are almost limitless, from ghostbusters to pole dancing to cross-dressing nuns. I'm not making that stuff up. So that's why I just want to say it's sort of, you know, you could-

Peter York:

I understand, and part of the challenge in the nonprofit sector, in all candor, is that what you're describing requires some type of body that can regulate, test, transparently evaluate and see what's going on. See, OpenAI is in a circumstance where I think they're not like the examples you're citing. I think they can make a very deliberate case, and they have to be more transparent, because they're so huge. They're a whale in the ocean with no other whales. So, at the end of the day, I get your point, and there definitely are a lot of nonprofits that are under the radar, sneaking by, with not a lot of good ability to challenge them. What has to happen with what you're describing is there has to be somebody out there policing that, right? There's got to be some way to essentially say this is a bogus one; we're going to file a claim that they're not actually doing what they say they're doing. We have to have some kind of enforcement out there. You can pass laws, but if you do not have any enforcers, the laws become meaningless. And so, Ken, you're going in the direction I wanna take this conversation, because to me the most interesting conversation is: what are the implications of OpenAI as a business model? Not for OpenAI itself, but as a way of scaling good that we've never seen before. But before we go there, let me make the other points, so that we have all the rules of the road for our conversation. The next thing is that nonprofits have to practice prudent management of institutional funds, meaning they have a legal duty to manage their finances responsibly.
UPMIFA, the Uniform Prudent Management of Institutional Funds Act, say that five times fast, typically requires nonprofits to diversify their investments to minimize risk. However, this does not apply if the investment furthers the nonprofit's mission. There you go.

Peter York:

Third, private inurement. This is the other one. There's a concern of, quote, private inurement, where nonprofit board members or employees gain unfairly from their positions. This is where I know you need to chime in, but let me finish this point. In the OpenAI scenario, potential conflicts arise from using nonprofit-funded research to benefit a for-profit subsidiary partially owned by employees, and now, we know, Microsoft and others; in other words, it's gotten even more complicated. However, OpenAI's legal team did look at this and ensure compliance, according to all the news media. But private inurement is the piece where Ken and I both raised our eyebrows around the 100X, because in a lot of ways, when we think about it, at what point is it unfair or too much? They say employees, but let's broaden it to investors: at what point is it too much? So private inurement is the one where the 100X cap is something to note.

Peter York:

I will mention a couple of other things really quickly. First, the hybrid governance model. It's really important to know that the nonprofit board does fully control that subsidiary. That is why, in the recent story, Sam was able to be fired. But there are a lot of people in the nonprofit sector making the point that the fact he came back proves it's not working. At first, the thinking was that they fired him because of concerns that he was an accelerationist, that he was trying to move AI so fast that it was unsafe, and so they didn't want him there.

Peter York:

That's the good of the nonprofit; it should have been that way. But what some are saying now is that they brought him back because all the employees were upset about their investment returns, and Microsoft was upset, and everybody was upset, so he came back. So all of a sudden we go: really, does that governance model actually work in this context? Again, I'm going to draw attention to just how humongous they are as a whale. I don't know that they're a good example for a conversation about the generalizability of this model for others, but at the same time, it's important to bring that context up. So, go ahead.

Ken Berger:

Yeah, so I told you my thoughts about the 100-times capped profit, but I think, for listeners, there are a fair number of nonprofits out there that have a for-profit subsidiary. In fact, where I'm working now, we've talked about it and we're exploring it yet again, and I've got another organization I'm involved with doing the same. So it's something that is frequently done, cognizant of all the rules Peter mentioned. But a couple of quick challenges typically come up with those for-profit subsidiaries. Number one, there's oftentimes a lack of expertise to run it; you've got your nonprofit model, and being able to manage a for-profit alongside it is a challenge. And then there's just the bandwidth: your typical leadership staff are stretched very thin running around, and now you've got this whole different entity with a different model and different challenges, and it can be a difficult thing to do. There are other kinds of hybrid models that people use, but my experience is that typically they don't make that much money.

Ken Berger:

I think the thing that really differentiates this one in this story is the fact that you have this holding company, and underneath that holding company you've got a place where people can put in boatloads of money. It's an unusual structure where you can have a large number of big investors in that capped-profit entity, because it sits below a number of layers, which we don't want to get too far into.

Ken Berger:

But the fact that it's structured that way allows capital to come in that's different from the typical for-profit that sits under a nonprofit in the cases you might typically see out there, and that's something that is unique and differentiates it. And I guess this is where Peter and I, when we look at this challenge for nonprofits, ask: is there something here, with all of the questions and the issues and the weirdness of being fired and coming back a week later and all that stuff? Is there something here for this fundamental challenge of raising capital that could be replicated? Is there a takeaway, from what all these super-expensive lawyers and tech people have created, that could be a learning opportunity for other nonprofits to generate capital? It's a question that I have in my mind from listening to this.

Peter York:

So, along those lines, let's remove this from the extreme outlier of OpenAI, the whale, and let's move it into a scenario. Let me give you a scenario, Ken, and then let's talk about it back and forth. Let's say we wanted to apply this type of model to begin to address some of the things I've been talking about on our episodes, which is that I do wish the nonprofit sector were more than 10%. I do feel like the profit motive is part of the problem with so much that's going on in the world. That includes the profit motive that is driving a lot of our social media, that's driving our polarization, our walling ourselves off and fighting each other, and everything else. It's all attention, ads, money, profit. It's the profit motive. So, just hypothetically, and this is not the scenario I'm giving, but I just wanna make a point.

Peter York:

I've often wondered about things like social media. Take Facebook: say you wanted a platform for people to connect with each other, people who have known each other for a long time, reconnecting with people from their childhood, sharing their stories, their families, what's going on. What if you put that under a nonprofit, and it had a mission of just connecting people, reconnecting people, long-lost people, whatever it is? And instead of a 100X cap, you had a subsidiary owned 51% by the nonprofit, and for the other 49% you could take on investors. Those investors could take that 49% and invest in this organization. And let's say we capped it, instead of at 100X, at 10X, okay, or 20X worst case, right? And so you allowed those investments to come in. Let's say, on top of it all, you were able to bring in program-related investments and mission-related investments, which means taking the dollars that are not the grant dollars from the philanthropies out there, but the corpus, which is where most of the money's sitting, and being able to use those for loans and investments in this particular company.

Peter York:

Now, imagine if we had started a company like Facebook that had all the same features and tools and everything else, but the difference was it wasn't driven by the profit motive: it was capped-profit, it was fully controlled by the nonprofit governance model, so they could fire leadership and do all that kind of stuff. And they're not at the scale of OpenAI, so they're not about to take the CEO back because the whole world, including Microsoft, is screaming and yelling. So if you take that as an example, or take an even more practical nonprofit example, and you have a 51-49 split, fully governed and decided upon by the nonprofit board, you could take on investors. I like the idea of finally letting loose what I think is still kind of tight: the PRI and MRI money that is sitting in the corpus of all of these grantmakers. Start leveraging those dollars better. What could we do in the nonprofit sector with this model?
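The hypothetical above can be sketched as a toy model, again our own illustration with made-up names and numbers, not any real entity's terms: a subsidiary 51% owned by the nonprofit, 49% by investors whose upside is capped at 10X, with everything above the cap flowing back to the mission side:

```python
def distribute_proceeds(total_value: float, invested: float,
                        investor_share: float = 0.49, cap: float = 10.0):
    """Split proceeds between capped investors and the controlling nonprofit.

    Investors get their pro-rata share of the proceeds, but never more
    than `cap` times what they put in; the remainder, including all of
    the excess above the cap, stays with the nonprofit.
    """
    pro_rata = total_value * investor_share
    investor_payout = min(pro_rata, invested * cap)
    nonprofit_payout = total_value - investor_payout
    return investor_payout, nonprofit_payout

# $10M invested for 49% of a venture whose proceeds later total $500M:
inv, npo = distribute_proceeds(500_000_000, invested=10_000_000)
print(inv, npo)  # 100000000.0 400000000.0
# Pro rata would have been $245M; the 10X cap clips investors at $100M,
# leaving $400M with the mission side.
```

The design question Ken raises next is visible in the parameters: lower the `cap` and more value stays with the nonprofit, but the deal may no longer attract investors at all.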

Ken Berger:

Yeah, so there are a lot of ifs there. The short answer I have is that I think there needs to be a lot more experimentation with tweaks to what we see here, to see if other models with some of the changes you've described are workable. So, for example, when you talk about making it 10X rather than 100X, you test the model and you ask: will that draw the same number of shareholders and investors? Will that be attractive enough for them if the return is a mere 10X? Will that work? Another thing is: what, if anything, is different about a model like a 10X cap that differentiates it from a B corporation? You know, B corporations have the same notion of a certain limit on profit.

Peter York:

They have the notion, but I don't know how legally obligated they are to fulfill that notion. I don't think they are.

Ken Berger:

And I think that also depends; there are different kinds. There are B-corps that are B-corps in mission and statement, to your point, and then there are certified B-corps that have to comply, in the real world, with certain commitments, which is a much higher bar, and many B-corps don't do that, and a lot of people don't realize it. So they think they're getting X when they might be getting Y. You know, the other thing that strikes me about the challenges of this, and again, I'm not saying we shouldn't be experimenting and trying, and I agree with you that there may be something here, is that it goes back to one of our earlier episodes for me, when I talked about how, traditionally, at the high level, the differentiator between the for-profit and the nonprofit sector is a question of mission: the for-profit's mission is profit, and presumably the nonprofit's mission is to serve the mission of the organization. That's their motive, rather. So it's profit motive or mission motive.

Ken Berger:

But let me finish the thought. The challenge that I noted there is that this applies to many nonprofits, in spite of this problem of getting capital and scaling.

Ken Berger:

Nonetheless, a driver that sometimes causes mission drift, and causes organizations to lose their way, is the growth motive as opposed to the mission motive. And so it seems to me like part of the story here is that when there are these humongous amounts of money and growth, growth, growth, on the one hand you say, well, of course we need that growth, because this is about the artificial intelligence being promulgated and spread, and the research and technology that requires. But at the same time, there is a question as to when there is an adequate amount of funds, and when we are moving towards mission drift. And I think that may have been part of what you were talking about when you said one of the theories as to why the board fired Sam Altman was that they were concerned things were going too fast.

Peter York:

I do want to mention that the current news media coverage on this, and I'm an avid AI news listener on a daily basis, has come to the conclusion that it's less likely that that was the original cause. There was a lot of speculation in the media, but I just want to...

Ken Berger:

Well, did you have a sense, from your readings?

Peter York:

No, there still is a lot of lack of clarity on this. I still reserve a part of me that believes it was the safety issue. There are two camps in the AI space. There are what's called the accelerationists, and they are the ones who are really pushing to advance AI as fast as possible. And I think, whether we like it or not, it's already happening, because there's a race. There's a race between companies, there's a race between countries. You know, China is going to drive us, everybody's driving us, so it is accelerating no matter what.

Ken Berger:

Yeah.

Peter York:

And I think the mission of OpenAI is actually right.

Peter York:

We've got to be really thoughtful about safety, and I do believe that there were board members who were basically going whoa, wait a minute.

Peter York:

Even though the media is coming to the conclusion that it really wasn't the cause.

Peter York:

I'm still skeptical, because of effective altruism. The effective altruists are a powerful force in many fields, including philanthropy, and one of the biggest existential threats they perceive, in addition to the environment and climate change, is AI.

Peter York:

And there are effective altruists on the board and among the employees, people who are part of OpenAI, and so I think this was the accelerationists versus the existential-threat camp. I think there's some dimension of that, I think it's being downplayed, and I still think it's a real thing. To me, the issue with OpenAI that I think is wrong, in my opinion, is the fact that the market pretty much drove them taking Sam back. I'm not saying he shouldn't have come back; I'm saying the cause of him coming back was basically Microsoft and everybody else, and all the staff, the 500 out of 700 who were going to mass exit, and who, by the way, are shareholders.

Ken Berger:

But that's what I wanted to ask you. Yeah, so one takeaway here, if we assume, let's just say, a better-functioning nonprofit (it's funny, I'm saying this as I'm thinking about it, it's ironic): if a nonprofit board terminates a CEO, whether they're doing it wisely or stupidly, and I've seen lots of stupid ones, I'm an authority on this, when they do that, it should stick, because they are supposed to be the bosses. So one of the tweaks that might be considered here is that if you're an employee, you can't be an investor, because you have a conflict of interest if, on the one hand, you're a shareholder and, on the other hand, you're an employee and you see the guy who's generating big bucks for you. So that might be one of the tweaks.

Peter York:

It might be one of the tweaks, I'm not sure. There are a number of ways of solving this problem. OpenAI taught us something, right, about this model that presupposes a primacy of mission: the ones who were worried about this happening too fast, for safety, pushed him out, and then the markets basically brought him back. So there are other ways to do it. You can say employees can't get shares, though I'm not sure I feel strongly about that; I'm spitballing here. Or you put some rules in so that if the board does fire the CEO, there's a legal repercussion if you take him back and you find out it was market-driven. These are lessons we can learn, and that's part of what I'm trying to get at: any of these models don't just require laws.

Peter York:

They require regulation, and they require a way to enforce it. Let me just make one other point about something you talked about, the growth orientation. I'm very agnostic on whether growth is good or not.

Ken Berger:

You know, we've had some discussions about this, but wait...

Peter York:

The point is, when it comes down to it, I just want to clarify something: profit is not the end. What is profit? In the for-profit world, profit is something that ends up driving the happiness of the shareholders; they get to bank that money, so to speak. So in essence, we often talk about it as an end, it's all about the profit, but really what it's all about is the shareholders getting their reward, their return on their investment. That's what profit is in the for-profit model. I think it's a real problem in the nonprofit sector that we keep saying, if they're not about profit, they're about mission. That's not true. They're about profit to drive mission. The profit has to come back into the organization in order to drive the mission further. So there's nothing wrong if those dollars come back for research, development, anything that's enhancing and moving along the mission of the organization.

Ken Berger:

So we say they don't make a profit; they make a positive variance.

Peter York:

A surplus, I guess. I'm not talking literally, financially, I agree. My point is that there's a surplus in both business models. It's a question of what does that surplus do? What can it do?

Ken Berger:

Where does it go? Where does it go? Yeah, exactly.

Peter York:

So your point is very much on target there. I just think that as we look at this model, it bears deeper discussion and ideation around how we should start to think of it, because I think the nonprofit sector is still stagnating. You can talk about growth all you want, but even the larger organizations, I don't know that they're really functioning at a level where they've advanced mission, to your point. They've just grown because they've become the whales, exactly, and we really do struggle with respect to getting to that mission accomplishment.

Ken Berger:

Let me respond to one place where you and I, you know, perhaps have somewhat different views. You talked about how it would require regulation, not just law, and how it's very important that we have regulation.

Peter York:

Let's just say you can't have laws without regulation.

Ken Berger:

Let's just stick with regulation for a moment. So here's what I have observed over the years, and it has really shaped my thinking. If I had a nickel for every time my colleagues, whatever their political stripes, have expressed frustration about this: typically, whatever government bureaucracy I've worked with, at least at the state level and to some degree at the federal level, its ability to enforce regulations, to have thoughtful regulations that are appropriate and targeted and lasered in, is in general a big problem. They don't have the technical knowledge, they don't have the staffing. And you know, you have what was called the military-industrial complex, and maybe someday we can talk about the nonprofit industrial complex and people moving around the system, the lack of institutional memory in government. So typically, government is often behind the curve on these things.

Peter York:

Wait, wait, wait. Not on tech, not on tech.

Ken Berger:

Let me finish, let me finish, and then I'm gonna turn it over to you; you clearly have a different point of view. Everything that I've understood about where we are when it comes to technology and where our federal government is (I can't even speak to state government), my impression from what I've read is that that problem of playing catch-up in the regulatory environment gets even worse the more complex, advanced, and cutting-edge things are. And when it comes to ChatGPT and AI in general, from what I've read, you've got an even bigger problem with government regulation playing catch-up. So you're saying no?

Peter York:

No, I'm not saying no, that we haven't had a problem with regulation. The problem is that it never goes that far. We pass the law and spend the money, and we actually don't get our legislators to build in regulation, because everybody says, oh no, no, no, now that we've gotten what we want, now that we've gotten the money... Because, by the way, government has funded the heck out of technology, the internet, everything else. I mean, AI is actually the first technology that didn't come out of government. When it comes down to it, all the big tech stuff came out of the military, came out of other things.

Peter York:

So the idea that government is not gonna do that kind of stuff... I get what you're saying. No, I know, I know.

Peter York:

But what I'm saying is that the regulatory problem is not a problem of government not knowing how to regulate. The problem is it doesn't regulate. And the reason it doesn't regulate is that the industries that want to be untethered, and not have all these brakes in the system, have really managed to make sure that they aren't regulated. I would argue that we've never had teeth because, in all candor, our legislators don't want to actually invest in the teeth.

Ken Berger:

You know, there's a term that's used for that: regulatory capture. What I understand that to mean is that these larger corporations are more than happy to have a regulatory environment, because they capture the regulations, they influence them. They are the experts that government turns to to help create the regulations, and then what happens next is that the regulations become a barrier to entry for smaller start-up companies, because these big whales, as we call them, are basically capturing whatever regulations there are. So it's not just the legislature. And then, of course, who pays for the never-ending fundraising of those legislators to get them into office? Who has the money to bankroll them? So it's a...

Peter York:

A cycle there, I hear you, I hear you. Let me come back to something I want to mention before we wrap up here; we've got about eight minutes. I just want to point out something: it's not okay to convert a nonprofit into a for-profit. That screams fraud.

Peter York:

Where I'm going is this: what if we reversed it? Why couldn't we start to think about all those for-profit businesses that have a social mission, convert them to nonprofits, and keep the for-profit part as a subsidiary, 49%, taking on investors so that the investors can still do their work? I think there are a lot of for-profit models like this, and I'm not talking about social media; I'm talking about things that actually do, or have the potential to, help people. And the reason they don't, I think, is the profit motive.

Peter York:

So what could the world be like if nonprofits could start to think creatively about adopting this model, and if for-profits that really are trying to do social good were able to form a nonprofit and then bring on investors in a different way? And again, I'm gonna say something here that I'm very interested in. I think there are a lot of dollars sitting in philanthropy's corpus, being invested. If the payout is 5%, there's a whole lot of money on that other side of the investment. It's going into the stock market, it's going into other things.

Ken Berger:

What could happen? The 5% you're talking about is foundations' required payout.

Peter York:

Yeah, that's what they pay out, but all the rest is investment. They get a return on that investment, and they grow their corpus; oftentimes they do better than 5% in the market. But the point is, what could happen with that other side? And there's plenty of energy and force going into program-related investments. Foundations can take their corpus and lend to companies, and those companies could be owned by nonprofits.
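To make the payout math concrete, here is a minimal sketch in Python. All of the numbers (a $100M corpus, a 7% market return) are hypothetical illustrations, not figures from the episode; the only fixed input is the roughly 5% annual payout requirement for private foundations that the hosts mention.

```python
def simulate_corpus(corpus, annual_return, payout_rate, years):
    """Return (final corpus, total paid out) after `years` of
    paying out `payout_rate` of the corpus annually and investing
    the remainder at `annual_return`."""
    total_payout = 0.0
    for _ in range(years):
        payout = corpus * payout_rate          # e.g. the ~5% required payout
        total_payout += payout
        corpus = (corpus - payout) * (1 + annual_return)  # rest stays invested
    return corpus, total_payout

# Hypothetical numbers: $100M corpus, 7% return, 5% payout, 10 years.
final, granted = simulate_corpus(100_000_000, 0.07, 0.05, 10)
print(f"After 10 years: corpus ≈ ${final:,.0f}, total grants ≈ ${granted:,.0f}")
```

Under these assumptions the corpus actually grows (to roughly $118M) even while paying out over $50M in grants, which is the point being made: the "other 95%" is a large pool that could, in principle, be deployed as program-related or mission-related investments rather than only as market holdings.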

Peter York:

See the opportunity for the nonprofit piece: a lot of the R&D that is for good, from these companies that are trying to do good, would sit in a nonprofit governance model. The profit motive would therefore not be eliminated, but it would be controlled, right, and the governance of the mission would be there. And again, we have to have teeth. I completely agree with you; if you're toothless, it doesn't matter. You have to have regulatory teeth. But at the end of the day, where could we go?

Peter York:

I think there are a lot of for-profit businesses that are literally sitting on the social-impact mission side, but they formed as for-profits and hoped they'd raise more money. They're not OpenAI; they don't have transformational stuff like that. So at the end of the day, they're not able to take their work to scale, because they're sitting in this very weird place, like, we're a for-profit for good, right, we're trying to do good in the world, and they're straddling that fence. And I actually think that business model is difficult to invest in, too. I just think: what could we do? We need to begin to explore how we could leverage the one thing the nonprofit sector has. It is the laws, the laws we were just talking about.

Peter York:

The laws that basically say: if you are a nonprofit, you do have to have a certain ownership and control. You have to really pay attention to your unrelated business income. You can't go off the rails and just make money for your investors because you wanna line their pockets. You have the prudent management of institutional funds. You have to follow the private inurement rules. So if you had all that, but you still had the ability to make money and attract investors, especially philanthropic corpus investors, PRIs and MRIs, how could we start to transform a lot of the work, nonprofit and for-profit, in these companies in a different way? That's what I'm excited by. I'm like, this is an interesting idea.

Ken Berger:

I still do believe that they have been so successful, even with all of the problems we were talking about and all the questions there are. At the end of this, to your point, I think there could be something here that could help with this fundamental challenge that nonprofits face. As to whether it's a more noble quest to have a nonprofit that sits on top of for-profits, because of the problems with for-profits, I think it depends. But I'm certainly open to there being more experimentation in this area, and I think we should continue to explore these hybrid models in our future episodes.

Peter York:

I completely agree. So this was a fun discussion, another episode of The Nonprofit Fix. Thank you, everybody, for joining us again. Do you wanna say any closing words?

Ken Berger:

Well, I guess the only thing to say is, you know, the two of us are learning as we do these episodes. I think we consciously tried to mix it up a bit more this time, and we hope you found it entertaining and educational and that you'll listen to us again after this. By the way, the last ironic thing is that we did this one on Zoom, and I've got R2-D2 and C-3PO behind me, so presumably I'm all into the tech world, but the real expert has got a tree behind him, and he's the tech guy by far. So we're both trying to balance out our existence. Anyway.

Peter York:

Well, I think this is a good one. It'll hopefully spark some conversation out there, and I do think it's worthwhile for the nonprofit sector to begin to at least expand how we think and ideate around business models, because one conclusion I can draw is that it's not working very well, and it hasn't for a very long time. No argument from me there.

Peter York:

And so we've got to figure out something. OpenAI has got its problems, but take the outlier element, see what lessons there are at a more micro level, and, who knows, maybe there's another good idea in it. Thank you, Ken.

Ken Berger:

Thank you, peter, this has been great.

Peter York:

And we'll go there.

OpenAI's Nonprofit Business Model
Nonprofit Compliance and Workarounds
Exploring a Nonprofit Governance Model
Exploring Regulation and Profit for Nonprofits
Exploring Nonprofit-for-Profit Business Models
Exploring Hybrid Models for Nonprofits