TIL: Vercel AI SDK - the bloat king
I was looking for a simple universal JS lib that could handle OpenAI/Anthropic/Google responses without having to install each client lib. Considering all these libs do is make JSON requests, they're actually quite big - for example, OpenAI's is 87 kB on its own (bundled with Vite), though Google's is only 25 kB.
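For context, a single non-streaming completion really is just one POST with a JSON body - something like this with plain fetch (a sketch; the model name is only an example, error handling omitted):

```js
// One POST to OpenAI's chat completions endpoint; OPENAI_API_KEY assumed to be set.
const res = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
  },
  body: JSON.stringify({
    model: "gpt-4o-mini", // example model name
    messages: [{ role: "user", content: "Hello!" }],
  }),
});

const data = await res.json();
console.log(data.choices[0].message.content);
```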
From what I've seen, there's one universal library from Vercel called AI SDK. It has 870k weekly downloads on NPM and a fancy package name: just "ai" (I wonder how much that one cost).
It's split into 2 packages, and the "Core" one sounds promising! So how much does this "Core" add to the bundle when using it for just a single OpenAI request?
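Per their docs, a single request through Core looks roughly like this (a sketch, not the exact code I bundled; the model name is just a placeholder):

```js
// AI SDK Core: generateText from "ai" plus the @ai-sdk/openai provider package.
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const { text } = await generateText({
  model: openai("gpt-4o-mini"), // example model name
  prompt: "Hello!",
});
console.log(text);
```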
A whopping 186 kB! For a tool that simply calls an OpenAI endpoint - which is basically just a JSON request.
I really don't get Vercel and the whole ecosystem they built with VC money. Everything they do is the most overcomplicated, architecture-astronaut-level stuff. Like here: the "packages" subdir alone is 73 thousand lines of code. For a JSON API wrapper!
I mean, the "bad guys" story would be: let's ship extremely bloated JS libs so that you'll pick our hosting to run them on our servers. But I really don't think that's the case - it's just that Vercel's company culture is all about overcomplicating everything.
Many devs say you choose Vercel for the amazing DX, but I just don't see it. Luckily, I haven't had to refactor Next.js codebases every year, but even from what I've seen, everything they do is the opposite of great DX.
Anyway, back to my original quest - I haven't found any minimalistic, universal JS lib that supports streaming from multiple providers, so I'll be writing it myself using the low-level "eventsource-parser" lib.
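Something along these lines (a rough sketch; it assumes the createParser({ onEvent }) shape from recent eventsource-parser versions and OpenAI's SSE format, with the model name as a placeholder):

```js
// Stream OpenAI chat completions with plain fetch + eventsource-parser.
import { createParser } from "eventsource-parser";

async function streamChat(prompt) {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini", // example model name
      stream: true,
      messages: [{ role: "user", content: prompt }],
    }),
  });

  const parser = createParser({
    onEvent(event) {
      if (event.data === "[DONE]") return; // OpenAI's end-of-stream sentinel
      const chunk = JSON.parse(event.data);
      const delta = chunk.choices[0]?.delta?.content;
      if (delta) process.stdout.write(delta);
    },
  });

  // Feed the raw SSE bytes into the parser as they arrive.
  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    parser.feed(decoder.decode(value, { stream: true }));
  }
}

await streamChat("Hello!");
```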
Has anyone found anything for this purpose?