Bing this🔎
Running a personal AI assistant, Mistral’s latest release, and why you should be using Bing Chat
Happy Friday & congrats on making it to the weekend! Now you just have to survive the extended close contact with your frenemies *cough* family.
Still need to score a couple of last-minute gifts? AI might just be the opportunity you're looking for. This lad tricked Chevy's online chatbot into selling him a truck for one dollar. Whether the deal actually went through is another story, but nonetheless, GENIUS play, Chris.
Quick Definitions (skip if you already know them!!)
LLM – Large language model. This is the general term for AI models like ChatGPT, which in practice are incredibly smart and capable chatbots. For a more in-depth definition, go here.
Model – this is just another way to describe a particular AI.
1 cool thing
Run your own personal AI from your computer
I'll admit up front that this is not going to be the most applicable tool for a lot of people, BUT for those of you power users who are going to rule the world, definitely lean in. LM Studio is free software you can install on your computer that makes it very easy to download, modify, and run large language models locally on your device.
What does that mean?
The big-name models like ChatGPT and Google Bard are hosted on big servers made up of hundreds of high-end computer towers (hosted as in that's where they live). As an example, let's say Google Bard's big stack of computer brains is located in San Francisco. When you log in to bard.google.com and ask a question, your question is sent via the internet to this computer stack in San Francisco, where Bard figures out the answer and then sends a message back. Like emailing back and forth with a centralized customer support center.
Now these leading models like GPT-4 are so freaking BIG that they need a bona fide building full of computers to work. But there are hundreds of smaller AI models, with new ones coming out every day, that require far less computing power to run. That's where LM Studio comes in.
With LM Studio, you can download one of these smaller models DIRECTLY onto your computer and run it entirely from your own device. When you ask your local AI a question, boom, it's answered on your device. You don't even need an internet connection. You can download and run one of these models completely free and throw a million questions and tasks at it, all on your device. This gives you complete control over the data you're sharing and more freedom to fine-tune the model for your needs.
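If you like to tinker, LM Studio can also run a small local server on your machine that speaks the same chat API format OpenAI uses, so you can query your downloaded model from a script. Here's a minimal sketch assuming that server is running on its default localhost port and you've already loaded a model in the app (the port and the model name below are assumptions, so check your own setup):

```python
# A minimal sketch of talking to a local model from Python, assuming LM Studio's
# built-in local server is running on its default port (http://localhost:1234)
# and you've loaded a model in the app.
import requests

response = requests.post(
    "http://localhost:1234/v1/chat/completions",
    json={
        "model": "local-model",  # placeholder; swap in the name shown in LM Studio if needed
        "messages": [
            {"role": "user", "content": "Explain what a large language model is in one sentence."}
        ],
        "temperature": 0.7,
    },
    timeout=120,
)

# The reply comes straight from your own machine -- no cloud round trip.
print(response.json()["choices"][0]["message"]["content"])
```

Same idea as chatting in the app, just wired into your own code, and nothing ever leaves your device.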
The big deal
An open-source model that beats ChatGPT, using a novel approach

Quick context: in the AI company landscape, Mistral is the leading open-source player. Where Microsoft, Google, and others are mostly keeping their fancy new AI models secret, Mistral releases its models publicly for anyone to use or modify. Last week they released their latest AI model, called Mixtral 8x7B. Helluva name, I know. Mixtral, not to be confused with Mistral🤣, is significant for two reasons:
It outperforms ChatGPT (GPT-3.5, the best free model right now) on most benchmarks.
It is the first major open model to use a mixture of experts (MoE) approach.
Okay okay I know the whole point of this newsletter is to avoid nonsense lingo like “mixture of experts”. Let me explain.
First, every other mainstream model out right now uses a ‘monolithic’ approach. When you make a request, whether it’s to write a Shakespeare play or simply look up the winner of the 2004 World Series, the entire model powers up and runs whole hog to give you a comprehensive answer. Think of that server farm from the example earlier: the whole building of computers powers up for every request.
When you ask this new Mixtral model a question, instead of firing up all of its brainpower, memory, and knowledge at once, it picks one specific part of its knowledge to answer with, depending on the question.
That’s why it is called ‘8x7B’: it has 8 smaller expert models (roughly 7 billion parameters each) inside it, each leaning toward different subjects and tasks, instead of one big model trained on everything from engineering to hairstyles.
Ex: you ask Mixtral how the Cold War ended, and it decides to use its history / fact-recall subsection to answer.
This is really cool because it's the same way we work. When your friend asks you where the bathroom is in your house, you instinctually point to the nearest one. You do not stop and start critically thinking about the answer.
By enabling the AI to choose the best part of its artificial brain for the task, Mistral has created a dramatically more efficient model and cast doubt on just how safe OpenAI’s lead in the AI race really is.
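For the code-curious, here's a toy sketch of the routing idea in Python. This illustrates the general mixture-of-experts technique, not Mistral's actual code: a small "router" scores each expert for the incoming request, and only the top-scoring experts do any work while the rest stay switched off.

```python
# Toy mixture-of-experts routing (an illustration, not Mistral's real code):
# a router scores all 8 experts for a given input and only the 2 best-scoring
# experts actually run, so most of the model sits idle on every request.
import numpy as np

NUM_EXPERTS = 8   # Mixtral-style: 8 experts inside one model...
TOP_K = 2         # ...but only 2 of them handle any given input
EMBED_DIM = 16    # size of the toy input vector

rng = np.random.default_rng(0)

# Stand-ins for trained weights: one routing matrix plus one tiny "expert" each.
router_weights = rng.normal(size=(EMBED_DIM, NUM_EXPERTS))
expert_weights = rng.normal(size=(NUM_EXPERTS, EMBED_DIM, EMBED_DIM))

def moe_layer(x):
    """Send the input x to its top-k experts and blend their outputs."""
    scores = x @ router_weights                  # how well each expert "fits" this input
    top_experts = np.argsort(scores)[-TOP_K:]    # indices of the best-scoring experts
    weights = np.exp(scores[top_experts])
    weights /= weights.sum()                     # softmax over just the chosen experts

    # Only the chosen experts compute anything; the other 6 never run.
    output = np.zeros_like(x)
    for w, e in zip(weights, top_experts):
        output += w * (expert_weights[e] @ x)
    return output

# One fake "request" embedding passes through the layer.
print(moe_layer(rng.normal(size=EMBED_DIM)))
```

The real thing does this routing per token inside the model rather than per question, but the punchline is the same: you get the knowledge of a big model while only paying for a slice of it on each request.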
Make life EASY
Bing Chat drastically outperforms OpenAI’s premium ChatGPT (GPT-4)
The first week of December I went over how to use the best AI models in the world for free. I must admit I didn't pay too much attention to my own advice, because I pay $20/month for ChatGPT Plus... until today.
My buddy Tim (AI power user) has been telling me these conspiracy theories about ChatGPT slowing down recently for the holidays because it knows we are all eating cookies and being lazy anyway.
Well I'll be damned, not 10 minutes into a project at work and it was the slowest I had ever seen ChatGPT respond. After suffering through it for an hour and hitting my usage cap, I had no choice but to hop over to Bing Chat and use the "More Precise" mode like I mentioned before, and HOLY CRAP it was 10x faster than the portal I have literally been paying for access to!
On top of that, the output even appeared to be higher quality.
So whether you are just getting started or a pro user, head over to Bing Chat for way faster, high-quality responses.
That’s it for today folks. Hopefully you learned a little, laughed a little, and have a Merry Christmas!!
Smell ya later,
Joe
Thanks for reading AI for Apes! Subscribe for free to receive new posts every week.