CENSORSHIP, SOCIAL MEDIA & FAILED JOURNALISM
February 28, 2022
S01 - E01
RSnake and Raymond discuss the topic of censorship and some of the lessons learned from social media mishaps. Raymond also discusses how ibble will be shaped by the policy failures of other companies, like Facebook and Twitter.
Today I'm interviewing Raymond Kaminski: an entrepreneur, a foodie, and a very thoughtful person. He started his career at NASA, eventually moving into finance at Honest Dollar, where I met him before Honest Dollar was acquired by Goldman Sachs.
Now he's the CEO of ibble, a platform which began life educating traders and has since pivoted to become a video-based social media platform. Sometimes it does take a rocket scientist to solve the innumerable ethical issues with social media.
We get into the uncomfortable issues with social and censorship, and I think we even managed to stay friends after it's all said and done. Without further ado, I hope you enjoy getting to know Raymond Kaminski.
Hello, Mr. Kaminski. How are you?
Robert, good seeing you.
Today I have with me Raymond Kaminski, the CEO of ibble. He's my premier interlocutor and this is quite an endeavor for both of us. I really appreciate you coming on.
Thank you for having me.
Well, I think there are a lot of interesting reasons that we have to talk and historically have talked about. Unfortunately, most people don't get to see what we get to talk about. This is an interesting way to open up and share a little bit about what kinds of conversations we're going to have.
No bourbon this time so it's going to be slightly different.
Well, I might have had some bourbon. You never know. There are a couple of reasons that I think it'd be useful to talk to you in particular. Number one, we're friends and colleagues, and we've known each other for quite a while and that's a good place to start. We have a common way of talking about things, which is good.
Number two, this is your studio and we're going to talk a little bit about ibble, probably a lot more in a little bit. But also, you were the first person who encouraged me to do a podcast. Do you remember that?
I don't, but I really appreciate you doing it.
It actually took a while for me to fully grok how important that would be. For a long time, I just kept it on the back burner in the back of my head, thinking someday I might actually pull the trigger on it. And here we are.
I think what's really important is there are so many great conversations that happen around the world and this is a great way to document them. I had one of my friends, Andy, tell me that something magical happens when you're in the studio. Being around one of the oldest buildings in Austin and hearing your voice is like a dancer dancing in front of a mirror.
There are amazing conversations that just flow in this room. I'm excited you decided to do it. I'm really excited you decided to do it here.
I could not be happier that I was able to make this all work out. A lot went into this as I'm sure you have some idea.
One of the major reasons or the primary reason I think that you're a great first guest is because this is a podcast. And in podcasts, we're talking about things. Those things can be controversial, or they could be totally benign. But I think it's important that we are able to have these kinds of conversations.
As the CEO of ibble, which is a brand-new social media platform and I'd love to get your take on how you describe it, you get the benefit of hindsight. You get to see what happened with Facebook and Reddit and Twitter and all these different social media platforms that have been beleaguered by some First Amendment issues, whether it's truly first amendment or people's perception of it.
I think censorship — whether it's government-sponsored censorship, or just some social media company deciding what they want on their platform, from a trust and safety perspective — there is certainly a chilling effect that's happening and a lot of people are feeling it. It's certainly front and center geopolitically and socially in general.
I'd like to start off by having this very first podcast be about censorship, how you're going to affect that change of all this prior knowledge you've had from all these different social media platforms, and how that's going to affect ibble.
To kind of unpack what you said, when most of us go away to school — high school, college, vocational school — the amount of time we're in school is fixed. It never changes. High school is four years, college is four or five years, vocational school six months, a year, two years.
What happens is time is constantly changing and we're learning new things. So we have to compress down a lot of knowledge in a short amount of time. What's really good with what's happened with a lot of social media platforms is we've watched years unpack, we've watched growth, we've watched mistakes happen.
What we get to do is watch from the sidelines and say, how do we reinvent all this? How do we start fresh and not be a platform focused on a feature, but a platform focused on reinventing social media? It's been really exciting for us so far.
Why don't you spend a few minutes telling us about ibble, and specifically, how you got here from ibble's perspective? Why ibble? Why now?
Actually, we didn't start as a social media platform. We started out as a finance app but with the same core knowledge. We were focusing on education. I thought it was going to be powerful to let people that are intelligent about a subject matter share stuff that they're passionate about.
So we started with publicly traded or even private companies that people are really excited about. When we were a finance app, we stopped focusing on, "Here's the graph, here's the fundamentals, and here's how it's trading a certain way," and focused more on what made a person excited about it.
Because I think you can find anybody passionate about something and when they're passionate about something, it comes out of how they speak about it. So, we started with that core idea of democratizing education, making conversations accessible. What we started realizing is a lot of people were sharing stuff that wasn't about finance on a finance app.
Can you give me some examples?
They'd watch a company come out, and then they would explain how they were doing their makeup, or what concerts they were going to, or what made them excited. We realized the product was a lot bigger than finance, which made me really excited.
Understandable, given the use cases.
So, we really started digging into it. Like any product, I always thought it would be really powerful. You can change the world if in one generation we shared why people were excited about different things.
One of the goals we've built in ibble is unlocking things that you don't realize you're curious about. I think every other social media platform — Facebook groups, Instagram, and everything else — you can follow and you can find stuff that you're curious about. But there's stuff that you don't realize you're curious about. You might be swiping through other forms of social media and be like, "Wow, that's really interesting. I'm excited about locks, or woodworking..."
Locks are pretty interesting, especially if you know how to break them.
I am obsessed with the LockPickingLawyer.
Yeah, he's pretty neat.
We thought that if people could share those conversations and allow people to jump in... Because we've been in this world with social media; I'm an influencer, I'm a celebrity, and I'm speaking down to you. We didn't want that to occur anymore.
We wanted to say, "Let's speak with each other. Let's have conversations with each other." That's how ibble was born — a short-form social media platform focused on conversations. Respond how and when you want: video, audio, text, or pictures.
It's asynchronous, so much like a text message. You don't have to answer it immediately. You can pick it up when you want. We allow people to fork conversations and spawn them and give references of where the ideas came from. We’ve just been exploring those different ideas as time has gone.
It's really exciting. When you're talking about the future of your company, I think it's useful to think about where is social, in general, going. One of the things we've seen is there's an app for something. There’s the 'Uber' for whatever. There's a social app for something in almost every case.
It feels like there are two completely different things happening. Number one, there's a balkanization happening. For instance, Facebook just announced they may pull some of their apps away from Europe because European law is making it too onerous on them. So there might be some balkanization going on where there's an app for Europe or an app for the United States.
There also could be technical apps, like an app for taking photos or an app for making videos, or some variant or sub-variant thereof. Where do you see things going in the next 5, 10, or 20 years?
We have that compression or consolidation and then expansion, and I think we're constantly going through that in every industry. We saw it happen years ago: AOL did everything, then MySpace packed on every feature, then Facebook expanded it, and then the explosion. Instagram and Twitter and all these other ones became those subsets.
I think people are getting sick of platforms. What we really want to focus on is people not platforms. We don't care what platform you're on. Also, I think a lot of these platforms have lost trust in their users.
So we're trying to remove the fat and get back to what social media was supposed to be. I think that has always been connecting people and making the world feel a little smaller and closer together. But, man, we've gotten away from that the last decade. I think there's a simpler way to fix it.
Humans are tribalistic. We're supposed to be social and have friend groups and social has weirdly made that harder.
You brought it up. I thought it was so ridiculous; we had a great conversation with some pretty influential people from Europe recently. I was explaining how they could use ibble. I said, "Listen, there's amazing tech that comes out of Europe. But we don't really hear about it much in the United States because a lot of social media platforms have localized themselves."
Unbeknownst to us, if you're on Facebook over there, you're going to see a lot of stuff on Facebook from Europe. When you're in the United States, you're seeing stuff from the US. Think about every time we're all swiping through TikTok: we see all the localization of our area or surrounding area, but we're not seeing what's happening in other countries.
I thought it would be really powerful if we started releasing news features within our product, allowing people to share their perspective or see other perspectives by geolocation.
Imagine what's happening with Ukraine and Russia. I think it'd be really impactful to see what someone on the ground in Ukraine was saying, what someone in Russia was saying about this, or someone in the surrounding countries was saying about this.
What we're seeing on the news in the United States is everything that we're saying about it. I think that's the wrong way to do it. If we're going to be a united world, if we're going to be friendly with each other, how you start with that is empathizing with each other and seeing how everybody thinks about things.
The lingua franca. If we can all speak the same language in whatever language that is, be it by technical means or actually speaking the real language. I think that serves to bring a lot of peace to society when it's done well.
That's the crazy part. If that's the core to a new social media platform, it doesn't seem far-fetched or difficult. But why aren't we doing it? It’s a weird question to ask.
About balkanization, there's also the balkanization of things like local laws. There can be laws associated with religious issues, for instance, the Prophet Mohammed drawings, or you can have certain disclosure pieces of legislation. Exactly the same thing that Facebook is facing right now. You can have them even within the United States.
How do you see that evolving? How is that going to change how ibble works in the future?
It's a really hard problem to solve. But when we think about building a social media platform, we have thousands and thousands of years of human interaction from different societies that have shown us what works and what doesn't. The reality is, we know how to go into a room and have a conversation with each other from different backgrounds, and not to be a jerk and insult people.
So, I think when you're building a technology platform, you should mimic what's already there. All you're trying to do is digitize what we already do in the world. That's the weird thing that these social media platforms are doing. They're trying to put a lens on things and shift how human behavior works.
The way I think about building a social media platform is we have three rules at ibble: no violence, no hate speech, and no explicit sexual content on the platform. If you can go and follow all three of those, you might have someone talking about something over here and someone talking about something over there.
If you don't like it, it's as simple as hitting the overflow menu and saying, "Don't show me this anymore." Or you just swipe up and the algorithm is smart enough to know, okay, that person doesn't like that type of content.
It's like regular life. If you're walking down the street or walking through a park, and someone's saying something, and it's not harmful, it's not going to create violence, it's none of those things, but you just don't like it, you acknowledge it, or you walk right past it. You move on to something else. I think social media platforms should operate the same way. Interact with the content that you want, find the community that you engage with.
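To make the "swipe past it" idea concrete: a feed algorithm can treat "don't show me this anymore" or a quick swipe-up as a negative signal and simply down-weight that kind of content for that user. Here's a minimal sketch of that mechanism. The class name, topic labels, and weighting constants are all my own illustration, not ibble's actual algorithm.

```python
# Minimal sketch of per-user negative-feedback ranking.
# All names and constants here are illustrative assumptions.

class TopicPreferences:
    def __init__(self, skip_penalty=0.5, like_boost=0.2):
        self.scores = {}                    # topic -> preference score (default 1.0)
        self.skip_penalty = skip_penalty
        self.like_boost = like_boost

    def record_skip(self, topic):
        # "Swipe up" / "don't show me this": multiplicatively down-weight the topic.
        self.scores[topic] = self.scores.get(topic, 1.0) * self.skip_penalty

    def record_like(self, topic):
        self.scores[topic] = self.scores.get(topic, 1.0) + self.like_boost

    def rank(self, posts):
        # posts: list of (post_id, topic); highest-preference topics surface first.
        return sorted(posts, key=lambda p: self.scores.get(p[1], 1.0), reverse=True)

prefs = TopicPreferences()
prefs.record_skip("flat-earth")
prefs.record_skip("flat-earth")     # two skips -> score 0.25
prefs.record_like("lockpicking")    # one like  -> score 1.2
feed = prefs.rank([("a", "flat-earth"), ("b", "woodworking"), ("c", "lockpicking")])
print([post_id for post_id, _ in feed])   # ['c', 'b', 'a']
```

The point of the sketch is that nothing gets removed: the flat-earth content still exists on the platform, it just stops being served to the user who walked past it.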
But the big problem a lot of platforms get into is they're trying to be Big Brother, Big Sister. They're jumping in there. They're making choices on our behalf and saying, "I don't think you should see that," or "I don't like that this is being said." We're all grown adults. We live in this real world.
Not all of us.
Well, but why should the digital world be different than the real world?
I agree. I'll get back to your rules there in a second because I do think that's interesting to unpack. One thing I want to say about video in general, it has an interesting effect of attribution.
It's really difficult even with modern technology to create very good deep fakes. You pretty much always know it's me if I'm talking.
Especially if it's coming from my account. What that does over time on your kind of platform, where you can see the person you're communicating with, is you can see the expression.
It's not just the words, you can see the intonation, the tone. I think that adds a lot of color and it can create a prosocial environment.
To expand on that, as we're growing — even though we might not be sitting at the Facebook or TikTok levels — we have levels of engagement on our platform that are just incredible.
We do a lot of Q&As on the platform. Anybody can create an event they want or create an ad-hoc thread, turn on Q&A, and invite and uninvite people as they want. Someone asks you a great question and I can be like, "Hey, I know Robert. Robert's a great person to answer that item on security."
I can invite you into the thread, because we allow the user to respond with video or audio, text, and pictures, too. What's been amazing is we've heard feedback from our creators.
They said this is the first time I've heard from my fans. This is the first time I've seen from my fans. Then on the opposite side, we've gone out and reviewed...
Is that a good thing or a bad thing?
I think it's amazing. It becomes really hard to be a jerk when the person is going to see or hear you. On the other side, it's the same thing. When someone responds to someone, if a celebrity responds to you, it's similar to what happens today.
Millions of people tweet at Elon all the time, hoping that he writes them. If Elon jumped on a video and responded to a question you asked, that's a badge of honor. So we've heard it on the other side. This is the first time I've seen my fans and heard from my fans.
Then the other side, the people that are asking the questions are saying, "This is the first time I feel seen by someone I look up to or someone I respect." What an impactful moment that it's happening right now on our platform.
Absolutely. It adds a lot of color in general. I have some very high-up Republican and Democrat friends. The Republicans' take on where this is going seems to be the government is probably going to have to run social media or at least some version of social.
Well, I know. But hear me out just so we can have a conversation about it. Their idea would be that you have to be an American citizen or abide by American law. You can be anywhere, but they have to know who you are. You have to have an account where they can hang you if you're doing the bad thing, or whatever.
But they also would say you can do anything you want. It'd be exactly like a street corner; that said, a street corner that is online forever, with all of the pros and cons associated with that.
Now, the government running anything sounds bad when it comes to technology. I know some people in digital services, so I can say that with a smile on my face.
They've been amazing at running stuff in the past.
Maybe I'll even get one of the guys from there over on this podcast. It would be an interesting take to see what a government-led social media company could provide in terms of an infinite amount of protection for freedom of speech as long as it was within the law.
No slander, no libel, no actual lawbreaking: no drug dealers or selling body parts or whatever. It also just sounds like a terrible idea. I can't quite put my finger on why. What do you think?
When I think of a government-led social media platform, I get really worried. We can't even agree in the House and the Senate on what rules to follow, or even, when it comes to the rules, what is right and what is wrong.
When you say you're not sure why this feels off: it just feels like it won't succeed. And if it does succeed, it's going to be highly weighted in one direction.
The way I think about it is it's a slippery slope with technology companies. It doesn't have to be ours. There are plenty of platforms out there, plenty of people trying to do novel things.
You want to let innovation happen in the private sector and make sure rough guardrails are put in place. Let the users decide which route they want to go. That's at least my thought.
Let's take one of those guardrails. So the Kyle Rittenhouse case came up late last year. I was following it very closely for a number of different reasons. Leading into this podcast was one of the reasons.
I was doing the normal thing that Robert Hansen does, which is a ton of research. Which means searching for all the horrible things you could possibly imagine. That's for anyone who does research for a living. They know you should never look at my search history. It's pretty abysmal.
It was interesting to watch Kyle Rittenhouse's case, in particular on Facebook. Because one day, I could search for Kyle Rittenhouse and there's content. The very next day, suddenly the search results were barren.
It said there were 60,000 messages that day or whatever, but it wouldn't show me any of them. Then you might think, okay, that's just a conspiracy there. They just messed up. People make mistakes or whatever.
Then later, they removed the part that showed me how many results there were. It’s not like they just did one thing. They went through and made additional code changes to further make it more difficult to see what was going on.
The day that Kyle Rittenhouse was exonerated by a jury of his peers, suddenly it was available content. So I don't know what got in their head that that was something that made sense to them.
I suspect it was this violence rule. I suspect that they don't want to glorify violence. That's why they didn't want this to happen.
From my perspective, I wasn't particularly interested in the violence. I was more interested in the case law. How do you put in guardrails that both protect people who do not want to see someone getting murdered, especially over and over again?
It was very widely publicized, but also allowed people like myself or other researchers to do what they need to do to understand what's going on. Actively search for things and not be put into a black hole?
I wish I could get in their heads. Where did a platform with clean ideas and simple functionality turn into a censor? Is it that you have to start establishing new roles within the company whose job is to protect the audience? I don't know. My stance is maybe it changes. We should come back in a few years and see: did we get tainted along the way? I hope we don't.
That's part of what this conversation is, by the way, Raymond. I want to make sure that if you're going to do it, you do it because this is what your plan is. I don't want you to fall into disarray.
My goal of building ibble has always been to make conversations accessible. Hard ones, easy ones. If you're going to have content, have both sides of the coin. Don't censor one versus the other because then we're not growing. We’re not expanding our minds. It becomes an echo chamber.
I feel like a lot of social media platforms have become like this. I know I'd be really upset on our platform if someone requested that feature and said let's snipe the content. When we talk about our three rules of no violence on the platform...
Expound on that. What would that look like? What gets kicked off?
I have zero tolerance for bullying on the platform. I don't want to see a platform where people that are trying to find their people get driven away. What we started realizing — and you've seen this happen with Facebook groups — is where ibble can grow and be different than a lot of places. Instagram, Facebook, all these platforms started with, and in a lot of regards still are, the model where you find the people that you know and you watch their content.
You can slowly find stuff along the way. Well, Facebook groups started morphing that a little bit. You could find people you didn't know that were interested in the things you were. It's kind of the Reddit world; they dig into that.
We've been exploring that idea that we think we could revolutionize social media by making people readily accessible to jump in and have conversations about stuff that they don't even realize they were passionate about. That they didn't realize they were curious about.
They could learn something along the way. But if someone's coming on there and trying to troll or bullying, get off our platform. That's not what we're here for. You can do that somewhere else. Go have a good time.
To press that button a little bit, let's say someone is wrong. They're just wrong. Sky's orange. The sky is whatever, some random color.
The earth is flat.
There you go. How much pushback from people correcting that person does it take for you to say that it's bullying? Or for you to say, "We're going to get out of the way and let this person be re-educated"?
That's the whole idea of communities. If you believe that stuff, and you want to discuss it and you're super passionate about it, who should I be to go and get involved in saying no, stop? Moderate that?
That's the messed up thing about a lot of our platforms. Listen, if you don't believe you should drink water and every drink you should have is soda and you want to find a community circled around that. Go and find that community, hangout, discuss.
Have a good time. But I don't think we should be part of that censoring. I don't think we should be part of that re-education. I'm obviously giving silly examples here, but we're starting to see that more and more. They're forcing people to put disclaimers on things. And in your example, they're hiding the content.
I just don't think it's right. The algorithm should be smart enough to know that you don't want to interact with that content and just not show it to you. We're creating solutions to problems that shouldn't exist. When I say "we," I mean social media platforms. It's way easier than we make it out to be.
It exposes people to things, and maybe it makes some people upset along the way. But that's real life. You walk down the street and you might interact with a few crazy people down Sixth Street here in Austin. Don't let it bother you. You have to be a grown-up.
One thing I would say is good about Reddit and, to your point, Facebook groups is that sort of self-moderation. Yep, that group gets to decide what the rules of its own miniature group are.
Now, you may not like that, but you can always just create your own group if you want different rules.
If you want to isolate yourself into a corner, go ahead. We have the ability on ibble to just create a hashtag. We could create one today. You can go and post and say #Robertisawesome, and all the content gets collected under that hashtag.
In the simplest form, that's a group. You can start finding people to have conversations with. As you start having discussions or conversations or whatever else like that, you can leave them open-ended for anybody to join if you want to. You can close them. We thought of making the world simpler.
We just start with a stage. Who do you want to be on the stage with you? Who should be in the audience? Is it wide open in the audience or is it a closed loop in the audience? Do you want to invite anybody to join you on the stage? Or is the stage wide open?
When you start with just those base rules, this is how we do conferences. This is how we do school. Some teachers might choose to allow any student to get up and discuss anything they want. Some teachers may say, "My rule of this classroom is I'm talking and you're listening."
In the world, we're bound by these basic rules of how we interact in elementary school, in high school, in college, in vocational school, and work. Why should a platform be different?
A friend of mine used to work at Twitter. At the time, he brought me in to try to handle their bot problem. They have an enormous bot problem. He did not believe it was solvable but he was willing to hear me out.
I said, okay, here's what I would do. He immediately said, okay, that's a great idea. We should probably start doing all these things. Part of it was starting to instrument bots in particular that they know are bots. You can start with the easy ones.
The easiest way to do that is just buy a bunch of bots and point them out yourself. That's a collection of things you can monitor. So, during that process, he probably shouldn't have done this, but he was trolling other people in his department with these bots, trying to explain to them that these things do exist despite how much they said that they don't exist.
As a result, they eventually fired him instead of handling the bot problem. To my knowledge, they have still not handled that bot problem.
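One simple heuristic in the spirit of the instrumentation described above: once you control accounts you know are bots, you can measure their behavioral signatures and compare unknown accounts against them. For example, cheap bots often post on suspiciously regular schedules. The sketch below flags accounts whose inter-post timing is nearly machine-regular; the threshold and the coefficient-of-variation test are illustrative assumptions, not anything Twitter actually ran.

```python
import statistics

def posting_regularity(timestamps):
    """Coefficient of variation of inter-post gaps; near 0 means machine-like."""
    gaps = [b - a for a, b in zip(timestamps, timestamps[1:])]
    mean = statistics.mean(gaps)
    return statistics.stdev(gaps) / mean if mean else 0.0

def looks_like_known_bots(timestamps, threshold=0.1):
    # In practice the threshold would be calibrated against accounts you
    # *know* are bots -- e.g., ones you bought and instrumented yourself.
    return posting_regularity(timestamps) < threshold

bot = [0, 60, 120, 180, 240]        # posts exactly every 60 seconds
human = [0, 45, 300, 330, 1500]     # bursty, irregular
print(looks_like_known_bots(bot), looks_like_known_bots(human))   # True False
```

Real detection would combine many such signals (timing, content similarity, follower graphs), but the core idea is the same: seed known bots, learn their fingerprint, then look for it elsewhere.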
Is he looking for work?
He has since moved on to greener pastures. But this is an interesting problem. They have a saying: "Be careful of the person on the other end of the keyboard." But what if there is no person on the other end of the keyboard?
What if we're talking about just an enormous onslaught of garbage? That looks like a lot of people are upset. But really, it's just the same person over and over again? Obviously, that is some type of harassment, but it looks an awful lot like social pressure.
How do you start tackling that problem?
We're starting to see the nefarious side of it: how do we steer audiences? We probably aren't all aware of it, but we're also seeing the opposite side: how do we make each other happy?
I urge any of you to look at the numbers on TikTok and wonder if you believe them. Go look at comment rates versus likes versus everything else. They’re spoofing those numbers left and right to give you dopamine hits.
When I go back and think about having a bot pointed at you, one person trying to steer the public, I think that's what's happened with a lot of social media platforms. And not just to make someone upset.
Imagine steering the persona of how people think or how they want to vote a certain way. A lot of these platforms are based on the typing. ibble's emphasis has not been necessarily on the typing. It's been on the person's voice. You hit on it, the intonation in their voice, how they look, the passion that they're speaking of if they're on video.
So the emphasis on the comments is less important on our platform. But it's obviously like the basis of Twitter. It's the basis of Reddit and a lot of other platforms. So naturally, you could go build systems with deep fakes and have a whole bunch of personas to go out there and do it.
It's difficult, but it's a lot easier if it's just text.
I don't know how long ago that bot issue was, but...
Well, let's just say it hasn't been solved yet.
But if we move forward, we might be having this question in 10 years. I've seen online where people have taken these deep fake programs. There are now actors and actresses that are fully simulated, so studios don't have to worry about paying residuals and everything else out there.
So if we zoom this conversation forward a decade, wow, that might be a new way of manipulating the audience. You spin up 100,000 variants of Robert's face morphed with Raymond's and they're all talking about the negativity of stuff. It's a hard problem to solve.
This leads back to the government thing where they know who you are. I understand that's a very sketchy, dystopian future where the government controls communication.
It's not a world I want to be in. It's scary.
Me neither. Also, I just see it coming at the same time.
I will say what we're talking about here, even though it would take a lot of computational power and everything to go and do that, what's the next iteration going to be? You’re posing an amazing thought experiment.
Text is the easy case. A lot of the viewers listening to this are probably computer science folks. It wouldn't be hard to build a bot; you can probably find 90% of the code online to get you started. With where Facebook is going with Meta on the virtualization of your face, it's not that far off.
We're seeing how good the tools out there are at text-to-voice, manipulation, and everything else. I see it being a few weeks' project to go in and manipulate 1,000 or 10,000 people with your avatar walking around in Meta. At least with our stuff being in video and audio, it feels like maybe it's a decade away, or five years away.
We've got time. Maybe not much, but we have time.
Scary, man. Where's my bourbon?
Exactly. I wanted to talk a little bit about the actual process of taking stuff down. Harassment of whatever kind, let's just say we've defined it, which I am not saying we have. But let's say you can say this thing is definitely harassment.
What is the process that you go through? Is there a human? Is there a committee? Is there a robot that just does a first pass? How do you handle incoming concerns about whatever the content would be?
I'll skate over the first one because we have some cool IP that sits against this. What we do is we image process every video that's up. If someone pulls out a knife, a gun, stuff like that, it detects it. But we don't know, are they just showing off their gun? Are they showing off their new knife?
Is it airsoft?
That stuff gets flagged. At that point, we have to do human interaction, because we're trying to train everything to say is this right? Is this wrong? That's the first level. The second level is, and we have this on the platform right now, you can go in and hit the overflow menu and say, report this content.
We've had some hot-button issues on our platform that got reported thousands of times. That rings the siren for us. We should take a look at this. If it's something that we're really worried about, like a gun coming onto the platform, we might even choose to keep the contents still up. But what we've done is we've stopped serving it up to the masses.
Your friends are going to see it. If I have 1,000 friends that are following me, and they like my content, they're going to see it. But for the people that don't know you, it might disappear for a small amount of time until human interaction gets to look at it and say, is this right? Is this wrong?
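The flow Raymond describes, automated detection flags a video, mass reports "ring the siren", and distribution is limited to followers until a human reviewer decides, can be sketched roughly as follows. This is a hypothetical illustration, not ibble's actual code; every name, threshold, and object label here is an assumption.

```python
# Hypothetical sketch of the moderation flow described above:
# automated detection flags risky objects, user reports past a
# threshold trigger review, and flagged content is served only
# to followers until a human makes the call.

from dataclasses import dataclass, field

@dataclass
class Video:
    id: str
    flags: list = field(default_factory=list)  # e.g. ["gun"]
    reports: int = 0                           # user report count
    under_review: bool = False

REPORT_SIREN = 1000  # illustrative threshold for mass reports

def ingest(video, detected_objects):
    """First pass: automated detection flags risky objects."""
    for obj in detected_objects:
        if obj in {"gun", "knife"}:
            video.flags.append(obj)
    if video.flags:
        video.under_review = True  # queue for human review

def report(video):
    """Users report content; enough reports ring the siren."""
    video.reports += 1
    if video.reports >= REPORT_SIREN:
        video.under_review = True

def audience(video, viewer_follows_creator):
    """While under review, serve only to followers, not the masses."""
    if video.under_review:
        return viewer_follows_creator
    return True
```

The key design choice mirrored here is that flagged content is never silently deleted; it keeps reaching followers while its wider distribution pauses pending human judgment.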
And with timestamps, they're not watching a long video.
Yeah. We've also done some pretty cool stuff with other IP that we've been testing. Instead of taking out the whole clip, we can blur that exact second.
There's a reason why our platform right now is 18 and up. We're trying to be thought leaders on a lot of what will happen when we let younger folks under 18 onto the platform, given COPPA laws. We think in the next few months, as we release all these new features, we could be the thought leaders on how to protect youth on the internet.
We can ensure that if there's stuff their parents say, "Hey, I don't want them to see that," it's handled. It becomes easy for us because for them to have an iPhone or Android, their parents are signing up with their account and attesting to how old they are.
Back to the content itself for a second. 18 and under is a great example. Reddit handles this with buttons you have to click, which is not exactly a high hurdle. But let's say you've managed to get that part right.
What is the difference?
How are you defining something that's 18 and up like pornography, for instance, which is a big hairy mess, no pun intended? I would be interested to see how you'd handle something like the statue of David or someone's baby photo. Or something that is a human being naked but is not necessarily something you would immediately point to and say that is pornography?
That's why we had to start with fewer rules than anything else, even though there are some rules in place. If you look at Twitter's rule base, or Facebook's, or other people's, holy crap. Good luck trying to decipher what they want out of it. ibble keeps it lean.
For example, for the audience, one rule they have is no promotion of violence. Yet promotion means trying to get someone's content out there. So are they banning Marvel movies? Those promote violence, and Marvel clearly advertises on their platform. There's definitely a double standard, even in their own rules.
Well, we've had our team internally moderate. Our goal, especially at this point where we're 18 and up and get to test the boundaries, is to try not to take down content. If somebody is on the platform and they're doing something egregious, we will hide that content. We will take that content down.
But I can count on two hands when we've done that so far. We’re not about doing that.
Those were egregious cases, presumably?
Some of it was a lot of hate speech, and some was straight-up sexual content on the main feed. Outside of that, we try not to block content. Someone's going to want to see that stuff.
The way I think about it is if it's something that you just don't like being said, and that's where 90% of this stuff is, then swipe to the next video. The algo should be enough to figure it out. Or you want to go a step further, and long hold or go to the overflow menu and say, block this user, or just don't show me content like this anymore. You will not see it anymore.
What is that saying that a lot of people repeat? It's that less than 1% of the people on the internet do 90% of the damage. By users reporting other users, and us going in and looking and saying, "Wow, this person is a bad seed for what we're trying to do on the platform," it becomes easier to push them off the platform if it comes to it.
Again, when we think about the rate of why and how Facebook or Twitter or other platforms are banning people or even TikTok right now, for what we're talking about. I'm talking about someone getting beheaded. I'm talking about someone masturbating openly on the platform. Those are the things we're talking about. Someone threatening to kill someone. Not saying I don't like your idea.
The first one and the last one are obvious because those are breaking the law. I don't think anyone's going to complain about that one. The masturbating one, you might get into trouble over whether that passes the Miller test. Because is that newsworthy? What if it was?
We have private rooms if you want to go down.
What if it's somebody noteworthy? What if it's the president of the United States doing it? Would that be something that passes the Miller test because it is something that would invoke the media to go that is noteworthy?
Again, a good thought experiment. We haven't run into these edge cases yet. What you're exposing is that you have to be willing to take a stance and say this is the line. Maybe some of these platforms just kept on pushing that line further and further.
What we're doing here with ibble is we're trying to hold that line as much as possible. People have been on the platform and are upset about why we haven't taken content down. We've had employees upset that we haven't taken content down.
We've been trying to hold that line as long as possible. The great thing is we're supported by a great group of advisors and VCs and everything else and they're willing to support us in that path of holding that line.
Let's talk a little bit about the safe harbor laws. This is one of the things that Facebook and Twitter use very heavily. They say “we’re kind of like a carrier. We're kind of like the telephone system." You can't hold AT&T accountable just because two people are talking over the phone and saying something bad.
They're supposed to meet that standard, and they get the benefit of saying, well, you can't sue us because we're just a transport. We're just a carrier. At the same time, they are very clearly publishers. They publish everybody's content for them.
They host them, and then they take content down when they decide that that doesn't meet their standards. In many ways, they're no different than a news station or something.
With a whole bunch of different reporters bringing ideas and concepts and everything forward.
Exactly. There’s a bit of concern amongst the more conservative side that their side is not being seen. But what that ends up looking like is, well, maybe we need to have legislation to push back on that and make them so that they are now one of two things. Either they're not a carrier, which is interesting, or they are a publisher. One of the two.
You can't have it both ways. Subsequently, there have been a couple of pieces of legislation. The most recent one was HB 20, in Texas. This is a very fascinating piece of legislation. Apparently, it started life as a bail reform law and somehow ended up being a piece of anti-censorship legislation. It was aimed specifically at social media companies that have over 50 million active users.
There's a huge amount of accountability built into the law that basically says that they have to report how they take people down and in what way and what law did they break. And if they didn't break the law, who reported it, and what did you do about it?
Effectively, making it extremely onerous to take down content that isn't meant to be taken down and if there isn't a legal reason or a legal basis to do it based on their terms of service. They still have the ability to say we don't allow this content, but it has to be clearly defined. Or they're just not allowed to do it, period.
Now, Facebook and Twitter fought back through an advocacy group. That group said that this is censorship of a different kind. You're basically saying that the government is now stepping in and forcing publishers to do something.
So currently, it is blocked in the courts; this is not in effect. But if you're a Texas-based resident and it does come live, it suddenly gives you a conduit to sue social media companies if they don't comply by leaving your content up, unless it breaks some clearly defined term of service.
What do you think about that?
There's a lot to unpack there. Everybody's trying to sue everybody. We get into this world all the time. Our legal has us look at this because, as we have a podcast studio and we help people get difficult conversations out, are we a publisher? Are we a platform?
What are you?
I think we're still figuring that out. The reason why this podcast studio started was the idea that there were a lot of influential people that could have great conversations like we're having right now.
We have a feature called Spark — the ability to spin off a conversation and reference where the idea came from. So we thought, well, if we're going to have that platform, a lot of social media is all about dancing and what you look like and the vanity side, and there's not a ton of highly impactful conversations that are on video. There's stuff on Reddit and very surface-level stuff on Twitter.
So we thought, let's have amazing conversations on barbecue and law and art and men's health and everything else, and give people these sparkable moments to go deeper. I think we're still trying to figure out how we blend here. The one way we've always approached it is we don't run our own podcast; ibble doesn't have one there.
So I think we lean more toward being a platform than a publisher, because we say: whoever we think has a really impactful conversation, whatever it is, if they make the case and we think the world needs it and it fits our user base, go ahead. Use the studio. Have a fun time.
We're even expanding this stuff to give people a way to get important conversations out. We don't make a cent from it. We provide the services and we think of it as goodwill to the community.
In that regard, we're not pushing content. So I feel like we're a platform, but I'm sure someone will argue it and say, no, you're steering stuff in a weird way. Or they'll make some excuses of what gets promoted through the training algorithm. So are we a publisher?
These are the parts of building a business that I don't get excited about. I try to focus on the product. I try to focus on the goodwill that happens in the world. This, again, circles back to the whole issue with the government. As soon as we get into this world where everybody is suing each other and getting upset with each other, it takes away from innovation and sucks energy from the important stuff that should be happening out in the world.
Speaking of existential threats, one of the things that I think is dangerous for you is you are not an ecosystem that can sit there by itself. You are supported by Apple and Google and whomever you host your platform with, Google Cloud, or Azure, or AWS. These are all the same problems over and over again.
And none of these companies have a particularly great track record of just allowing content to live. They all have taken content down. Also companies like Cloudflare. You're in a weird position because you can do everything right from a First Amendment perspective — allowing content to live, to engage in good conversation — and still be taken down by virtue of some other company deciding that they don't like your content. It doesn't meet their terms of service.
Have you navigated that already or have you thought through that?
When we started building the platform, we had a lot of highly influential people reach out to us. And to us, it was always a scaling problem. And I want this to come across clearly. We don't care. As long as there's no hate speech and violence and the implicit sexual nature on the platform, we don't care who's on the platform.
But sometimes it's a scaling problem. You don't want your platform to be seen as single-sided.
I wouldn't have anything to do with you if you were on either side, by the way.
I think we've done a fantastic job of trying to extend the olive branch for everybody to see this as a place to have awesome conversations on tarot card readings and makeup and military veterans. A little bit of everything. That's something that we've stood by up to this point.
I'm trying not to scoot around the question you're asking. I think we're going to get to that point where, as the platform grows, we're going to bring every kind of personality on board. Hopefully, it's not seen as steered in one way or the other. Then it becomes showing all sides of the conversation and letting people choose what they want to hear.
I think that's one of the hardest things to do when building a social media platform. It's crazy who comes out of the woodwork. As soon as you start getting traction, someone wants to pull it toward one side versus another.
I try not to live in fear of what Cloudflare, Amazon or these other platforms are going to get into. I think if we would have steered left or right or taken a stance on something versus another, someone's going to get upset. Someone's going to advocate to break the platform down and strip it down to nothing.
I choose not to live in fear of what someone else is going to do to the platform. I just try to follow what we've always heard; do what your mama has taught us. Be nice to people, they'll be nice back to you, and try to be a good host. That's how I think the platform should be.
We're being a good host to every personality and every background. I think if we do that, we have nothing to fear from the platforms. But if they want to take it down, so be it. I held to my virtues and I built something special. I'm not going to bend one way or the other because someone says, "If you don't, you're gone."
I think the next version of HB 20 probably will have to take this into account. It will have to go after the content providers as well as the ISPs of the world, or the people who control where things can be hosted.
If not, none of these things have any teeth whatsoever. They can just say, "Nope, we're not going to allow Facebook anymore because we just don't like Facebook and their content."
Think about what's going to happen. We've talked about this at the beginning of the conversation, and I think that highly impactful conversations should happen around the world. And in opening us up to perspective and diversity and understanding of where each person is coming from, there are going to be pissed-off people along the way.
But we have to tell that story. There would be public outcry if they censored that. Because if ibble becomes the only one fighting this fight, and we're the only ones standing up for neutrality and diversity across the board, then everything else becomes the same: US people talking to US people, Catholics talking to Catholics, Muslims talking to Muslims, and so on.
That's a great segue. Censorship and filter bubbles. There's a theory in economics that people will say, "I want to vote for XYZ candidate," but when they actually get to the voting booth, they decide differently. The revealed preference can often be completely different from what they've said.
That kind of flies in the face of traditional economic theory. They said they wanted this thing, they've been saying this the whole time, why would they change it at the very last second?
I think social media has its own kind of revealed preference. It reveals itself as wanting censorship. It reveals itself as wanting filter bubbles. And for those who don't know what filter bubbles are; if I type in "Egypt" and I'm in Egypt, I might find local news. If I type "Egypt" in the United States, I might find something about travel to Egypt or something. So very different content.
Worse yet, people may only be able to see the content that is most suited to them, despite the fact that it might be the content that is least good for them. Facebook ran a mood manipulation study in 2016 or something. I actually got in touch with a researcher and got the code. So I was able to see what it looked like.
It was very stupid; very easy-to-write code. I think the number of terms they used to detect whether they wanted to make you feel good or bad was only maybe 50 or 100 in total. It was very short. But even in that short study, they were able to manipulate the moods of something like 40,000 people.
That is pretty terrifying that they would run that kind of experiment, and it was so simple. All they did was decide that these people are only going to get to see bad things that put them in a bad mood — theoretically, they don't know for a fact if that was going to work or not — or only get to see good things. And sure enough, it actually did change people's moods.
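The mechanism described, a short keyword list classifying posts as positive or negative, then a feed that only serves one kind, really is only a few lines of code. Here is a minimal sketch of that idea; the word lists and function names are illustrative, not the study's actual terms or code.

```python
# Minimal sketch of keyword-based mood scoring and one-sided feed
# filtering, the kind of simple approach described above.
# Word lists are illustrative only, not the study's real lexicon.

POSITIVE = {"happy", "great", "love", "wonderful", "win"}
NEGATIVE = {"sad", "terrible", "hate", "awful", "lose"}

def mood_score(text):
    """Positive score for upbeat posts, negative for downbeat ones."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def filtered_feed(posts, mood):
    """Serve only posts matching the target mood ('good' or 'bad')."""
    if mood == "good":
        return [p for p in posts if mood_score(p) > 0]
    return [p for p in posts if mood_score(p) < 0]
```

The unsettling point is exactly how little machinery this takes: a static word set and a filter over the feed is enough to skew what an entire cohort of users sees.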
That's the problem with filter bubbles. You get to decide what content they see and get to decide, as a content provider, what mood they're going to be in. How do you combat that? How do you make sure that that is not where you end up?
You're balancing things out, and sometimes you go too far and get away from what you're trying to do on the platform. For example, we're at 27 minutes of daily active engagement on the platform, which is incredible for any platform. And we're growing.
If I zoom forward a year from now, and highly impactful conversations are there, but we have some product managers sitting there and being like, "Okay, we're sitting at 45 minutes." Like TikTok, I think, they're like 55 minutes per day. "We need to pull another 10 minutes out of it." Are we doing harm to people? At what point is this now a revenue thing that you're trying to squeeze energy out of people?
I've never thought about our product that way. I haven't urged anybody to go in that direction.
How about your shareholders?
What I thought was really impactful is our VCs; they were the seed investors behind Instagram, WhatsApp, and LinkedIn when they were at Sequoia. They're an amazing VC fund in Columbus, Ohio.
What I thought was amazing, is they saw how a lot of social media platforms have lost their way. I don't want to speak for them, but I think they would support us in the journey of what brings value. What are we trying to do with this platform?
Is there a conscience there?
I don't want to give quotes of what they've said. But they definitely have realized that in the past, maybe they went astray on a few things. Yeah, it became a lot of value, but at some point, you think of the mental health of the world.
Look how destructive social media has become. People are focused so much on vanity and depression is spawning from that and suicide and all these other things. I don't know if I'm answering your question.
How do you prevent the filter bubble?
You want to do localization. Now, this is the engineer turning on. You want localization because if I search "food", I want to see stuff that I can get out into the world and engage with. So if I search "pizza", I probably want to see something like Marye's Austin stuff, blended with other areas around the United States. That's a perfect example of bringing value.
Now, let's talk about the way they hide things. You were talking about the Rittenhouse example. You search something and it shows only one side of the lens, or it hides it and sways you away from it.
I still think if you're actively searching for something, show them the content. And if you're passively searching stuff, we should know enough about what you're interested in to show you that content. So, if you're searching "Egypt", what brings you value?
If a story is trending in Egypt right now, I think it's the responsibility of the platform to show what's trending. And not be like, "No, we don't want to show a hot ticket item that's happening in London or Egypt or wherever around the world. We only want to show the happy-go-lucky aspect to it."
So as you're building the product, you shouldn't overcomplicate it. I think that's where a lot of these platforms get into trouble. Because why am I searching Egypt? Well, maybe there was something weird that just happened. Maybe there's political turmoil happening over there. Why aren't we showing that story? Maybe there's a new election happening over there and people are interested in it. Why aren't we showing that story?
Why aren't we showing people having conversations about it? Specifically, why aren't we showing people from those countries? If you're searching those different items, why aren't we showing people from over there discussing them? I think that gets into: how do you display the content? How do you let them circle around it? How do you let them filter through it?
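One way to read the approach sketched in this exchange, blend locally relevant results with what is trending globally rather than serving only the filter-bubble pick, is as a simple interleaving of two ranked lists. This is a hypothetical sketch; the function name, the ratio, and the dedup rule are all assumptions, not anything ibble has described.

```python
# Hypothetical sketch of blending local search results with
# globally trending ones, so neither side of the lens dominates.

def blend_results(local, trending, k=10, local_share=0.6):
    """Return up to k results: a local_share fraction from the
    local ranking, the rest from the global trending ranking,
    deduplicated while preserving order."""
    n_local = int(k * local_share)
    picked = local[:n_local] + trending[:k - n_local]
    seen, out = set(), []
    for item in picked:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out
```

The design point is that the blend ratio is an explicit, inspectable knob rather than an opaque engagement-optimized ranking, which is what makes the "show what's trending, not just what's comfortable" stance auditable.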
That is certainly going to be a challenge in the future for you.
It's a UI problem, actually, because you want to surface the important stuff, but you don't want to lose sight of the other stuff. I think a lot of these platforms get so focused on tuning the algo into "spend more time here so we can throw more ads your way." That's not necessarily how we think about doing things.
That leads me to my next question. How do you see content that is extremely shareable, but it's also clearly just clickbait and designed to inflame people? Obviously, it is good for ad revenue, incredibly good. But this is the saccharin of content; this is the worst thing for you.
We're approaching things slightly differently. Everybody went in the direction of ads back in the day, and we didn't steer into that. We want to be a creator-focused ecosystem. So the things we're moving toward are tips, subscriptions, and paid events and clubs.
Think about how many creators you watch probably on a daily basis. You wake up in the morning, you see their content, and you smile or you feel better about yourself, and you're learning something. I think it's a weird world that we're not helping support their growth.
If they're doing that and making 1000s of people happy, or maybe they're sharing mental health things that make someone not commit suicide, why aren't we making it easy for them to make a career and a life from creating content?
With a lot of the features we're building, there will always be free aspects of ibble. Maybe if you're a personal trainer, you're sharing free content. But if you want to get into the nitty-gritty of what the meal plans are for the day, that's behind a paywall with subscriptions. If you want to sit in a seminar about learning a new skill, maybe that's a paid event. That's the creator economy we're trying to support. As of right now, ads are not even on our roadmap.
An adage amongst my friends goes something like this: Once upon a time, the smartest people in the world worked in government. Then they all moved to the stock market, and now they're all in ad revenue. The smartest people in the world are just putting tiny little boxes of ads next to search results.
So if you're not going to do the ad model, how do you monetize your user base? Or are you just going to say, "This is now on the content provider"? And since you're not the content provider, you're just a platform, how do you enable them to make as much revenue as they possibly can so you can share some of that upside?
The reason that's important is that if you're not successful in this business, if you can't make a go of it based on them making a go of it, this won't work. This is an interesting experiment where you're more of a Substack model: a whole bunch of people out there making content, and hopefully it works. Otherwise, this thing is just a very big expense.
We've put a lot of ideas and thoughts into this. Our pitch deck was floating around out there. Now we're seeing Twitter, Facebook and all these other platforms trying to release features that kind of mirror the direction we want to go into.
We've always believed in supporting your creators, taking a small cut, and becoming the intermediary. Because what did we see during the pandemic? OnlyFans became a multi-billion-dollar revenue business.
So I can tell you we have an influencer relations team here. We work with hundreds of influencers, even right now. And what's really impactful is, until you get to a certain size, Twitter, TikTok, and Facebook don't even want to deal with you. When you're in the world of 1,000 followers or 50,000 or 100,000 followers, you're still driving traffic, you're still creating content. You should still be able to make revenue off that content.
So I think you create the automated tools that do it and support creators down that route, and we take a small cut along the way. It's kind of like the direction YouTube tried to go. But YouTube is sitting on the ad model on the other side of it, so they demonetize channels and say, "Hey, figure it out yourself."
Then you gotta go out there and find your own sponsors, and hire a sponsorship team and everything else. We strip that all back and make it really easy for creators to create fun and engaging content and make money.
Is there a more ethical way to promote content that has a better outcome for society? This is the upside of the filter bubble. You know that there's something that if everyone watched this, their day would be a little bit better. How do you curate that content in a way that is both useful, but also not destroying someone who's trying to do their research and find the not so nice parts of the internet so that they can do their job?
That's the hardest thing. You can see through the data of what's a highly engaged piece of content. But I think it's the platform that dictates what that content is.
Again, TikTok gets 55 minutes of daily active engagement, but for the longest time, before they started adding Learn on TikTok and all these other things (the direction we went down a long time ago), it was just a dancing platform. It's in the name, TikTok, like a metronome. It's focused on dancing and music and everything else.
Was that 55 minutes of sitting on the app highly engaging and teaching you anything? Or was it just brain-numbing? Is it the equivalent of what MTV did years ago, or the History Channel, finding something we'd be glued to the TV for, eyes wide open, just consuming?
Are we actually learning? Or at some point, has it gone past that and it's in one ear and out the other and it's dampening our life? Time will tell.
I think 27 minutes of daily active engagement for us right now is great. We're not trying to optimize every second or anything else. What we're trying to focus on is the amount of conversations that spawn from the platform. I feel like that's a better direction that brings more value to the platform.
Back to your point, is that us playing around with the filter bubble to make people feel happier because they're participating in conversations? I guess you could take it that way because we're trying different things to make people feel involved in an inclusive environment where they feel like they can spin off a conversation, invite their friends and discuss things or find new friends on a platform. It's a very thin line of how you're optimizing your time.
To that point, I think one of the things a lot of these companies run afoul of is that they pick their favorites. But the problem is their favorites aren't very good. And their favorites tend to be journalists.
I don't actually have a ton against journalists themselves, but they don't really exist anymore. Once upon a time, journalism was a real thing. You could find them and they were hard-hitting and they really wanted to do a good job.
Now it's all clickbaity, huh?
Yeah, and now it's all driven by ads. It has a lot to do with the fact that newspapers aren't really a thing anymore. I mean, when is the last time you picked up a newspaper?
That's a crazy thought. The last time I actually picked up a newspaper was maybe five years ago.
Me too. That just shows that the original model that they had is gone.
I miss going through those classified ads, looking for new cars and the stuff that I would never buy.
Well, that's how you got your stock picks. You actually went through this long page of all these stocks. It was quite a while ago. But the way I like to put it is that journalism is just a failed profession. It doesn't really exist anymore. What we have is editorial; we have a lot of editorial.
Editorial could be okay, and sometimes it is actually pretty good, but you really have to search for it. You just can't reliably pick some outlets and say their editorials are good. It just doesn't work.
I had a long conversation with someone the other day about this. With movies and journalism and everything else, if you go back 30 years (you could argue even today), you had the same 20 actors creating 90% of the volume of movies. Most of the content came from the same reporters and the same journalists. You just knew them by name.
I remember growing up and watching Barbara Walters, or, in Chicago, one of my favorite writers. And you would just see things through their lens. I think the good that's come out of social media is it's exposed the world to a lot of different perspectives. You're not following one or two; you could follow thousands, and see everything from all different lenses in all different areas and perspectives.
I think that was something that was exciting that happened along the way. I do agree, in a lot of regards, journalism is dead, but I feel like it's just mutated. Because you're not giving these long analyses of what's happening in the world, but you're dealing with a society that needs the quick bite of what's happening, get to the point, and give your references really quickly.
And therefore, total lack of nuance. That's pretty dangerous.
I follow some people on other forms of social media. Not pushing ibble, but something that we've been exploring is the idea that each clip is up to 90 seconds. And if you are done with it, you swipe up and move on to something else. But if you want to dig in, you can swipe in. And you can put as many references as you want. And we're really digging into the reference source.
So someone like you and me might get curious about something else, and then swipe to the end: "Here's the Wikipedia article, or the medical journal, or the MIT journal." So it allows you to be like, "Cool, I think I got what's happening, and you've sparked my curiosity enough for me to dig in."
But right now, what it forces us to do is do that research ourselves. What this is doing is it's trying to put you in the lens of, if I'm watching a Robert Hansen thread, you could follow his thought process and be like, "Here's all my backing and proof and everything you should read if you want to follow my route."
You can leave the platform and go somewhere else. But also, you could watch people spin off the conversation, put their references and their thoughts and their links and everything else.
That's how I've tried to solve that route, which is, how do you condense thoughts and quickly evaluate? Do you want to learn more? Do you not want to learn more? If you do, let's put them all in a collected area, at least to make the research a little easier.
So I agree, in a lot of regards, journalism has mutated into something else. But if used right, I believe that you could quickly dig in and understand and compress that research cycle into something much easier to get through a lot more content per day.
And this gets back to what we talked about at the beginning. College is still the same four years. Well, if I can condense more information into those four years, if I can condense the amount of time we spend per day doing research so that you've digested a lot more information, you're a more rounded person, you have a bigger background on things, and you can discuss things in more breadth.
That's a perfect segue. There are lots of different things. Facebook can make someone feel angry or they can make them happy. I think ibble could make people smart in a very interesting way. I think by virtue of the fact that you're able to see someone's expression, first of all, you're getting some nuance from them. You put a question mark; are they being snarky, or do they really just not know?
That's a very big difference that you don't get in text form. And all these platforms are almost entirely in text form or long form, which is another problem. It's hard for people to get through all this very long-form content, speaking as someone who's doing something long-form right this second.
But I think there is something about both the ability to spark these conversations. So now you have a thread. This thread is content that is all related to one another. And it's going to be more civil because there's attribution. People know who these people are. You've got your names on there and your faces on there, too.
You can't ignore that this person is saying this very specific thing, because that's the way they're saying it. They use some inappropriate language. Are they saying it as a joke or this is what they really feel? These are very different things and that nuance is lost.
How do you see ibble's role in helping people get smarter? Like, the citation thing is a great example of that.
We had that in one of our pitch decks early on. I get really excited about the idea that within one generation or even a few years, if I can compress more information and more perspective, even 10% more into someone that's evolving their young brain, they're going to be exposed to so much more information. They're going to come out of this with way more knowledge.
I think we're not the only ones solving it. We've all seen TikTok mutate recently with a lot more learning content on there and its blend of edutainment. You're going to see a funny dance video, and then a second later, you're seeing someone else share some facts about something. We just go that step further because where it ends is, again, broadcast mode.
I've put something out there, and it's stopped. Well, how do we learn? We learn in classrooms, not just by listening to the teacher. They figured this out over the years. The reason why we do tests is that they force us to do homework and prove that we did it, because through example and through process, we make what we've heard reality and sink it into our brains.
The idea of ibble was to spin off the conversation, give a reference to it. Where this would live on ibble is all these 50 different subject matters, we can highlight all the different aspects. And we can see maybe SoundCloud, 90% of the audience hated it, but 10% of them really circled in on something. And now there's a hot ticket item that everybody is discussing. And now we're seeing all the references and discussions and everything circling about it.
Then it allows us to go back into those conversations. So when we finish this conversation, we high-five each other, go home, and have a good night. Well, if this were on ibble, a day later we'd go back in there and see all the spin-off conversations that have happened, rejoin those points, and expand on what we're thinking or hear how our audience was thinking about it.
So it's the core of what ibble is. I mentioned we started off as an education platform. We bring it forward as a great way to have conversations. But we're not just having conversations for the sake of wasting each other's time. We're having conversations to see each other differently, hear each other differently, and learn from each other. That's why I say we're an education platform at our core. That's what I mean by that. I think it's just a powerful way of learning from each other.
Something I don't think I've ever publicly said out loud is that I am extraordinarily dyslexic. Like, off the charts. I took a test — a buddy of mine has a pretty cool app he was building and he wanted me to beta test it — and I got, I think, a 27 out of 29 on the dyslexia scale. Just about as dyslexic as you possibly can be.
I have a very difficult time getting through even just a couple of sentences. It takes a lot of work, but my retention is high. So it makes me smart. It makes me delegate a lot to other people to have them read things for me, digest them, and give them to me. Podcasts are great because I can hear people without having to sit down and try to parse out sentences.
I was mentioning to Robert before we started this that I've never seen someone show up like this. If you could see Robert's lap with the biggest layout of a discussion ever. I was like, we're talking about all of that? Is this like a whole season? But it helps.
It helps me get through things. Everyone has their method of dealing with the way their brain works. I've managed to survive okay, except when it comes to social. Social, I am a mess because when I write things down, it looks one way to me. Then I'll hit Enter, and then I'll look away, and I'll look back, and I immediately spot three errors.
How did I miss that? The words aren't even close. It's not even in the ballpark of what you might think is the right word. How do you enable people to be people and make mistakes? Even RAM comes with error correction. The ability of computers to make mistakes shouldn't be discounted. If we expect computers to make mistakes, how do we deal with humans making them?
Your mind and my mind are similar. How this platform got built is I had someone really smart tell me one day, "Raymond, I think what you built was trying to dissect how your mind worked."
ibble is kind of an AI Raymond and an AI team. Like, different people that work here at ibble, we think about how we solve problems and how we communicate, and how we do different things. We basically built features around that. Much like I said earlier, how do we solve events? How do we solve seizures? We just look at what's happening around the world and we digitize it.
Back to your point, we'll have live functionality coming up shortly, but right now, everything's asynchronous. It's up to 90 seconds, much like dropping a text message. The really cool thing is that instead of a dedicated two hours sitting here, this conversation could have unpacked over a two-day period.
People that subscribe to the thread could watch this thing unpack, much like watching a thread on Reddit. And the great thing is, because you're confined to 90-second clips, you kind of think of those like chapters. What do you really want to get out? What do you want to share? What references do you want to put there before you drop that thought out there?
But the really exciting thing that happens along the way is because the platform is live, when that content drops, your fans can interact. So questions might come up. They might start spinning off those conversations. And you might be thinking of your next video.
But what you really should do is you should just wait and listen to how they are interacting, because maybe they don't want to hear more. Maybe they want to steer your conversation. Maybe they want you to pause and come back to something.
That's a really unique aspect of ibble versus a normal podcast where we're going for two hours. Maybe there's something like, "Hold on, can we expand on that?" If there was a third person in the room, this conversation would be way different.
The other way you can think about it is, let's say you record your 90 seconds, and you watch the playback even before you hit "Post," and it's incoherent. You didn't hit your point at all. Delete it and re-record it, because it's not live yet.
Or you pushed it live, and you went back and looked at it two minutes later, and you saw three comments come in. It's not like you're deleting the entire thread, you can just remove that one aspect. And I do it all the time.
I'll put something up, and I'll think about it a minute later. It's like, "That's not the way I want to explain it." I'm seeing the first two comments come in, and I'm like, "People are confused, let's rewind." Maybe I have 10 other posts in the thread. I can just delete one and pick it back up where I left off.
So it's kind of like an undo button or a redo button that you can go to. Think about comedians versus a podcast: in a podcast, we're focused on each other, but a comedian is focused on the crowd. They'll tell a joke and look out into the crowd. Did they get a reaction? Did anybody smile? And if they did, good comedians will steer the conversation and figure out their audience.
That's the really exciting thing that you can do with it, because it's kind of that same aspect. You can explain something, or you can move on to something else and see how your fans are interacting. That gives you pivot points along the way.
And for someone that might be battling dyslexia or ADHD or something like that, they can condense down their thoughts, get something out there and step away for a few minutes. And they don't have to feel like they're glued to the platform.
To wind down this thing, what are the top couple of things that you want to do with ibble over the next year? I know you've released just in the United States. Is that correct?
Yeah, fingers crossed. But probably by the time this airs, we'll be available in Canada and Mexico. Then we're working diligently to get this out into Europe. Those will be good pause points for us to go and see how the fans interact and how everything blends. Then we'll slowly keep on releasing it wider and wider.
Where can they get it?
It's available on the Google Play Store and the Apple App Store, so they can download it on either device. We have plans eventually to release more stuff on the web. You can watch most of the conversations on the web right now if you're not in the countries that we mentioned. But the fun aspect of it comes from learning the core idea.
We're about conversations. Swipe up to find new stuff, swipe in to learn more, and if you want to find something really dedicated, search for it. Find a hashtag and dig in there.
I think the reactions you're going to get, at least on other platforms, are probably that people are bummed that they can't get on ibble. That's probably a good place to be.
We've had a lot of people, surprisingly, in Europe, super excited to be there. And then I didn't realize we had such a big fan base in Canada. We've had people finding other ways to install the app and get out there. We're just trying to follow the rules of what Apple puts out there for us.
Well, thank you very much for doing this. I know that was a heavy conversation, but you did great. I hope to have many more of these offline with you. And what we can't do offline, let's do on ibble.
This was great. Thank you so much for coming in and having the heaviest conversations ever. I think you're going to have fantastic conversations here and we're super happy to have you.