
What Builders Talk About When They Talk About AI | Andreessen Horowitz

My basic view is that inference will not get that much more expensive. The basic logic of the scaling laws is that if you increase compute by a factor of n, you need to increase data by a factor of the square root of n and the size of the model by a factor of square root of n. That square root basically means that the model itself does not get that...
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
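My own back-of-envelope for that square root (a sketch assuming the Chinchilla-style relation that training compute scales with parameters times tokens, which the episode does not spell out):

```latex
% Assumption: training compute C \approx 6 N D, with N = parameters and
% D = training tokens, and compute-optimal training scales N and D together.
C \approx 6\,N D, \qquad N \propto D
\;\Longrightarrow\; C \propto N^{2}
\;\Longrightarrow\; N \propto \sqrt{C}, \quad D \propto \sqrt{C}
```

So multiplying training compute by n multiplies the served model, and with it the per-token inference cost, by only about the square root of n.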
But just like the internet, someone will show up later and think about something like Uber and cab driving. Someone else showed up and thought, “hey, I wanna check out my friends on Facebook.” Those end up being huge businesses, and it’s not just going to be one model that OpenAI or Databricks or Anthropic or someone builds, and that model will dominate...
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
Not sure about this one. Just an interesting snippet. I figure there are some reinforcing loops in the data: the models get better with more data, which attracts more users, who generate more data. At the same time, I believe there are big advantages in the knowledge of how to train and how to manage inference at scale, which makes a huge difference. I do not see anyone catching up to OpenAI at the moment, especially with their new fine-tuning offering.
An interesting factor might be figuring out the right data mix for pre-training and using better screening to weed out unwanted behavior. Whoever can figure that out at scale might have a huge advantage, if they can keep it a secret.
Very simply, Jevons Paradox states: if the demand is elastic and the price drops, the demand will more than make up for it. Normally, far more than make up for it. This is absolutely the case of the internet. You get more value and more productivity. I personally believe when it comes to any creative asset or work automation, the demand is elastic...
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
I think this will be true for AI in general. ChatGPT is the entry point into the enterprise: companies get accustomed to it, see tangible results, and want to try it elsewhere without demanding an exact quantification of cost savings or revenue increases (which is often too hard to do for AI).
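A toy version of the elasticity argument, with made-up numbers and a constant-elasticity demand curve just to make the arithmetic concrete:

```python
# Toy Jevons-style calculation: constant-elasticity demand Q = k * P**(-e).
# If demand is elastic (e > 1), cutting the price increases total spend.
def total_spend(price: float, elasticity: float, k: float = 1.0) -> float:
    quantity = k * price ** (-elasticity)
    return price * quantity

elasticity = 2.0  # assumption: elastic demand for creative work / automation
before = total_spend(price=1.0, elasticity=elasticity)
after = total_spend(price=0.5, elasticity=elasticity)  # price halves
print(after / before)  # -> 2.0: quantity goes up 4x, total spend doubles
```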
It will make things way better if you can just dump in massive amounts of information. It should be able to know a billion things about you. The HBM bandwidth is there.
—Noam Shazeer, Character.AI
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
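Rough intuition for the bandwidth remark (my numbers, not Shazeer's): token-by-token decoding is usually memory-bandwidth-bound, so a floor on the time per generated token is roughly the bytes streamed from HBM divided by HBM bandwidth.

```python
# Back-of-envelope decode latency, assuming a memory-bandwidth-bound regime.
# All figures below are illustrative assumptions, not measurements.
params_bytes = 70e9 * 2    # ~70B parameters at 2 bytes each (fp16/bf16)
kv_cache_bytes = 20e9      # assumed KV cache for a very long context
hbm_bandwidth = 3.35e12    # ~3.35 TB/s, roughly an H100-class accelerator

seconds_per_token = (params_bytes + kv_cache_bytes) / hbm_bandwidth
print(f"{seconds_per_token * 1e3:.0f} ms per token")  # ~48 ms
```

Even with tens of gigabytes of context state streamed every step, the per-token floor stays in the tens of milliseconds, which is roughly the sense in which "the bandwidth is there".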
I actually disagree with Shazeer here. I think a bunch of further improvement will come from figuring out better ways to find important information in databases (instead of just similarity search) and from figuring out how to extract relevant information from conversations, documents, etc., so we can provide only the relevant context.
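For reference, this is roughly all that "just similarity search" amounts to (toy embeddings and cosine similarity; the improvements I mean would sit on top of this baseline):

```python
import numpy as np

# Toy dense retrieval: cosine similarity over pre-computed embeddings.
# Real systems get the vectors from an encoder model; random stand-ins here
# just show the mechanics.
rng = np.random.default_rng(0)
doc_embeddings = rng.normal(size=(1000, 384))  # 1000 documents, 384-dim vectors
query_embedding = rng.normal(size=384)

def top_k(query: np.ndarray, docs: np.ndarray, k: int = 5) -> np.ndarray:
    docs_norm = docs / np.linalg.norm(docs, axis=1, keepdims=True)
    query_norm = query / np.linalg.norm(query)
    scores = docs_norm @ query_norm               # cosine similarities
    return np.argsort(scores)[::-1][:k]           # indices of the best matches

print(top_k(query_embedding, doc_embeddings))
```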
how can you take the knowledge work that someone is doing and use AI to help them be dramatically more productive at doing that particular flavor of cognitive work? In our observation with developers, more than anything else, AI helps keep them in flow state longer than they otherwise would. Rather than hitting a blocker when you’re writing a chunk...
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
We have lots and lots of customers who want to have specialized models that are cheaper, smaller, and have really high accuracy and performance. They’re saying, “Hey, this is what I want to do. I want to classify this particular defect in the manufacturing process from these pictures really well.” And there, the accuracy matters. Every ounce of accuracy...
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
I think ChatGPT will bring a bunch of companies properly into AI. They will start out trying to solve things with GPT and then figure out how to solve them better with a bespoke, smaller model. It could trigger a new AI summer for deep learning.
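One hedged sketch of that "start with GPT, then go bespoke" workflow: let the big general model label examples, then train a small, cheap, task-specific classifier on those labels (a distillation-style pattern; the data and labels below are invented, and I am using text rather than the podcast's image example just to keep it short).

```python
# Sketch: train a small, cheap classifier on labels produced by a large LLM.
# The texts and labels are placeholders for whatever the big model produced.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["scratch on housing", "weld looks clean", "paint bubbling near seam"]
labels = ["defect", "ok", "defect"]        # imagine these came from GPT

small_model = make_pipeline(TfidfVectorizer(), LogisticRegression())
small_model.fit(texts, labels)             # bespoke model, cheap to serve
print(small_model.predict(["deep scratch along the panel"]))
```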
If you made a thousand versions of an LLM, each good at a different thing, and you have to load each of those into the GPUs and serve them, it becomes very expensive. The big holy grail right now that everybody’s looking for is: are there techniques where you can just do small modifications and get really good results? There...
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
PEFT (parameter-efficient fine-tuning) in a nutshell.
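A minimal sketch of one such "small modification" technique, a LoRA-style low-rank adapter (my own toy implementation, not any particular library's API): freeze the big weight matrix and learn only a low-rank update.

```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """A frozen base linear layer plus a trainable low-rank update (W + B @ A)."""

    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():   # the big matrix stays frozen
            p.requires_grad_(False)
        self.A = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, rank))  # starts as a no-op
        self.scale = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.base(x) + self.scale * (x @ self.A.T @ self.B.T)

layer = LoRALinear(nn.Linear(4096, 4096), rank=8)
trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(trainable)  # 65,536 trainable parameters vs ~16.8M frozen in the base
```

Serving a thousand specializations then means swapping a few megabytes of adapter weights per task instead of keeping a thousand full copies of the model on the GPUs.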
But if you could get AI to suggest interface elements to people and do that in a way that actually makes sense, I think that could unlock a whole new era of design in terms of creating contextual designs that are responsive to what the user’s intent is at that moment.
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
I have a Turing test question for AI: if we took AI in 1633 and trained on all the available information at that time, would it predict that the Earth or the sun is the center of the solar system—even though 99.9% of the information is saying the Earth is the center of the solar system? I think 5 years is right at the fringe, but if we were to run ...
Sarah Wang • What Builders Talk About When They Talk About AI | Andreessen Horowitz
There was a study out there that did this "time-line" training.