
What Builders Talk About When They Talk About AI | Andreessen Horowitz

Sometimes they just need the model that actually fits for their specific use case and it’s far more economical. We want people to build on top of our models and we want to give them tools to make that easy. We want to give them more and more access and control, so you can bring your data and customize these models. And you can really focus on the l...
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
If you look at every technological shift or platform shift so far, it’s resulted in more things to design. You got the printing press, and then you have to figure out what you put on a page. More recently, mobile. You would think, “Okay. Less pixels, less designers.” But no, that’s when we saw the biggest explosion of designers.
—Dylan Field, Figma
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
Dylan Field on why AI will not mean fewer jobs
I have a Turing test question for AI: if we took AI in 1633 and trained on all the available information at that time, would it predict that the Earth or the sun is the center of the solar system—even though 99.9% of the information is saying the Earth is the center of the solar system? I think 5 years is right at the fringe, but if we were to run ...
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
There was a study out there that did this kind of "time-line" training, only using data up to a cutoff date.
My basic view is that inference will not get that much more expensive. The basic logic of the scaling laws is that if you increase compute by a factor of n, you need to increase data by a factor of the square root of n and the size of the model by a factor of square root of n. That square root basically means that the model itself does not get that...
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
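My rough reading of that square-root claim, as a back-of-the-envelope sketch: if training compute is roughly proportional to parameters times tokens (Chinchilla-style compute-optimal scaling, which is my assumption, not something stated in the talk), then both the parameter count and the token count grow with the square root of the compute multiplier, and inference cost tracks the parameter count.

```python
# Illustrative only: the square-root argument under assumed Chinchilla-style
# compute-optimal scaling, where training compute C ~ 6 * N * D and the
# optimal parameter count N and token count D each grow ~ sqrt(C).
import math

def optimal_split(compute_multiplier):
    """Scale params and tokens by the square root of the compute multiplier."""
    return math.sqrt(compute_multiplier), math.sqrt(compute_multiplier)

for factor in (1, 100, 10_000):
    params, tokens = optimal_split(factor)
    print(f"compute x{factor:>6}: params x{params:>5.0f}, tokens x{tokens:>5.0f}")

# compute x     1: params x    1, tokens x    1
# compute x   100: params x   10, tokens x   10
# compute x 10000: params x  100, tokens x  100
#
# Inference cost scales with the parameter count, so a 10,000x jump in
# training compute makes each inference call only ~100x more expensive.
```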
Powerful AI systems can help us interpret the neurons of weaker AI systems. And those interpretability insights often tell us a bit about how models work. And when they tell us how models work, they often suggest ways that those models could be better or more efficient. —Dario Amodei, Anthropic
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
It's a little bit of a conundrum. A model we do not understand explains another model we do not understand.
If you have to be correct and you’ve got a very long and fat tail of use cases, either you do all of the work technically or you hire people. Often, we hire people. That’s a variable cost. Second, because the tails of the solutions tend to be so long—think something like self-driving where there are so many exceptions that could possibly happen—the...
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
We have lots and lots of customers who want to have specialized models that are cheaper, smaller, and have really high accuracy and performance. They’re saying, “Hey, this is what I want to do. I want to classify this particular defect in the manufacturing process from these pictures really well.” And there, the accuracy matters. Every ounce of acc...
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
I think ChatGPT will bring a bunch of companies really into AI. They will start out trying to solve things with GPT and then figure out how to solve them better with a bespoke, smaller model. It could trigger a new AI summer for deep learning.
But just like the internet, someone will show up later and think about something like Uber and cab driving. Someone else showed up and thought, “hey, I wanna check out my friends on Facebook.” Those end up being huge businesses, and it’s not just going to be one model that OpenAI or Databricks or Anthropic or someone builds, and that model will dom...
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
Not sure about this one. Just an interesting snippet. I figure there are some reinforcing loops in the data, where the models get better with more data, attracting more users, generating more data. At the same time, I believe there are huge advantages in the knowledge of how to train and how to manage inference at scale, which makes a huge difference. I do not see anyone catching up to OpenAI at the moment, especially with their new fine-tuning offering.
An interesting factor might be figuring out the right data mix for pre-training and using better screening to weed out unwanted behavior. Whoever can figure that out at scale might have a huge advantage, if they can keep it a secret.
It will make things way better if you can just dump in massive amounts of information. It should be able to know a billion things about you. The HBM bandwidth is there.
—Noam Shazeer, Character.AI
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
I actually disagree with this. I think a bunch of further improvement will come from figuring out better ways of finding important information in databases (instead of just similarity search) and figuring out how to extract the relevant pieces from conversations, documents, etc., so we provide only the information that matters.
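For reference, what I mean by "just similarity search" is something like the toy sketch below: embed every document and the query, then rank purely by vector similarity. The embed function here is a crude bag-of-words stand-in for a real embedding model, and the documents are made up; it only illustrates the ranking step that I think is not enough on its own.

```python
# Toy sketch of plain similarity search: rank documents by cosine similarity
# to the query. Real systems use learned embeddings; the ranking is the same idea.
from collections import Counter
import math

def embed(text):
    """Crude stand-in for an embedding model: bag-of-words counts."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

docs = [
    "meeting notes from the march planning session",
    "invoice for the march cloud bill",
    "notes on the new retrieval pipeline design",
]

query = "what did we decide in the march planning meeting"
q = embed(query)
ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
print(ranked[0])  # "meeting notes from the march planning session"
```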