I can see the subtle jabs and polite-ish mockery when I tell people that I started my career in crypto and moved to AI in 2025. That’s fair, I suppose.

Anywho, there are some really interesting points of cultural contrast between the two that I think you can only really understand if you’ve spent real time in both.

There is a longer post in this one, but that’s for later.

The People

Crypto is the best example of a barbell industry. On one side, you will meet some of the genuinely smartest people you have ever met, or will ever meet, in your life. I’ve met Russian cryptographers building publicly accessible community-organizing software on blockchain rails to coordinate protests against the invasion of Ukraine. I know a kid in high school who led development of, and built a company around, a completely novel Ethereum L2 solution that increases transaction throughput while preserving the security and trust assumptions of the base chain.

In short, there are wizards everywhere and powerful minds in abundance.

On the other side of the barbell, you have people who are pure value extractors in the absolute worst way. It’s one thing to max-extract with clever MEV strategies that exploit consensus-mechanism inefficiencies and protocol design flaws; these people just shill garbage and dump it on you in the end. Pure grifters, through and through. When I was a VC in the space, I met over drinks with some founders we were considering backing, and they told me how much money they had made, explicitly, from insider trading memecoins.

Bad.

In AI, the distribution is far flatter. We were recently at the AI Engineer conference in New York and were struck by how normal the range of people was. There are a few outliers who are clearly operating on their own plane, for sure, but otherwise it felt like a random sampling of tech people, with nothing particularly captivating about the dynamics at all.

Perhaps this is simply a function of size. AI is orders of magnitude larger than crypto, so it’s possible things just even out in the long run.

My initial sense is that this doesn’t tell the whole story.

Telos

Crypto, especially in the early days, had a very clear organizing telos: self-sovereignty. The promise was strong: overthrow the gated financial institutions of the past and usher in a future where you controlled your own destiny, with unfettered access to all the tools typically reserved for those sitting in Wall St. skyscrapers.

Being motivated by that promise, in my experience, correlates with a certain type of intelligence. In general, people in crypto are self-taught, self-directed, and self-empowering.

Furthermore, this telos provided something tangible and vivid to run towards. Having a carrot to chase made crypto a fun industry, at least in the early days. (I’m speaking less from experience here; I only joined in earnest in 2021, after graduating.)

It’s hard to describe what the AI telos is. There are vague terms thrown around, like “the age of abundance” or “the end of work”, and other somewhat meaningless shit like that, but those are not vivid images.

The race to build AGI is a race to the Schwarzschild radius. Sure, once we’re on the inevitable path to the singularity, it might spit us out into Sam Altman’s heaven realm, but we don’t know, and we risk spaghettification.

Everyone feels this. As a result, the industry is deeply self-serious. Crypto twitter was rife with mini-experiments, new protocols, and weird NFT projects to play with. AI twitter is a panicked echo chamber of arXiv papers and tacit dread that the next Anthropic release will nuke another wave of startups.

Ending Awkwardly

There are a million things to develop here and I have them all saved in a secret note that’s going to turn into a great post one day, but that day is not today.

I don’t really miss crypto. I never fully aligned with its main telos, and I’m not particularly interested in finance, but boy do I miss a lot of the people. I know these people exist in AI, and it’s likely that the field’s sheer size scatters the signal, but I’ve not met them yet.

AI needs a better picture of what we’re running towards, and its leaders need to do a 10x better job of articulating what’s gonna happen when we reach the singularity. Altman weakly gestured at it in his Gentle Singularity post, but it’s still filled with vague gesturing at life on the other side of the black hole, and that’s not enough.