HUGE LM Studio Update | Multi-Models with AutoGen ALL Local

Hey and welcome back to day 27 of 31.
We're almost done, but today I have a wonderful update that came from LM Studio.
I talked about it briefly in yesterday's video, but what I'm talking about is multi-models in LM Studio with AutoGen.
No, I didn't mispronounce that; I said multi-model, not multi-modal.
What that means is that with a big recent update to LM Studio, we can now have more than one model running on one server.
And all we need to do is adjust the model property in the config list,
and we can have multiple config lists, each with a different model loaded from LM Studio.
Let's take a look and see what I mean.
Okay, I'll put a video in the description where I talk about LM Studio: how to download and install it, get it running, and find your way around it.
But in the latest update, we have multi-model sessions, where you can load and prompt multiple local models simultaneously.
So how does that work?
Well, once you download, install, and run LM Studio, you'll be greeted with this screen.
And the first thing you do is actually download at least two different models.
So on the home page here, you can just look at some of these, like Phi-2 or Qwen.
You can come down here to get Zephyr, and just go ahead and download a couple of these.
And then there is a new tab on the left-hand side here called Playground.
So you'll click Playground.
You'll see that they have a multi-model session here.
Just click Go.
And now what we can do is load a couple of models.
So up here, it says select models to load.
Choose this; I'm going to choose Phi-2 first. Once that one's done, choose another one.
I'm going to choose StableLM Zephyr because it's also a smaller model. Okay,
and once you have that done, just come over here on the left-hand side, click Start Server, and we're up and running.
Now let's create an AutoGen file where we can see how we have two different models from one LM Studio instance running at the same time
with the agents.
So what we'll have is two different LLM configs.
I'm going to name the first one Zephyr and the second one Phi-2.
Now, how we distinguish them is in our config list: we have a model, a base URL, and an API key.
With the new AutoGen update, we can say "lm-studio" for the API key so it recognizes LM Studio.
Then we have a base URL.
This is the same.
This has always been the same for LM Studio.
Then we have a model name; really, this is the model identifier for the model that we've loaded, right?
So I'll show you where to find this in a minute.
Whichever assistant agent has the Zephyr LLM config in its definition is going to use the Zephyr model. Now for Phi-2, there's another LLM config:
we have the config list, and then the model, which is the identifier for the Phi-2 model. Then we have the base URL,
which is the same, and the API key,
which is also the same. Also, I might not have mentioned this before, but for the cache seed, you can set this to None,
meaning that it will never cache your results, so every time you run this the output can be different.
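
Just as a rough sketch of what those two configs might look like in code (the model identifiers, the localhost:1234 port, and the "lm-studio" placeholder key are assumptions based on LM Studio's defaults, so copy the exact identifiers from your own server):

```python
# Two LLM configs pointing at the same local LM Studio server (default port 1234 assumed).
# The model identifiers below are examples; use the API model identifiers shown in LM Studio.
zephyr_config = {
    "config_list": [
        {
            "model": "stablelm-zephyr-3b",           # API model identifier from LM Studio (example)
            "base_url": "http://localhost:1234/v1",  # LM Studio's local server endpoint
            "api_key": "lm-studio",                  # placeholder key; LM Studio doesn't check it
        }
    ],
    "cache_seed": None,  # disable caching so each run can produce a different answer
}

phi2_config = {
    "config_list": [
        {
            "model": "phi-2",                        # API model identifier for the Phi-2 model (example)
            "base_url": "http://localhost:1234/v1",
            "api_key": "lm-studio",
        }
    ],
    "cache_seed": None,
}
```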
I have two agents: one named Phil, who's going to be using the Phi-2 model, and one named Zeph,
who's going to be using the Zephyr model. Then I just have Phil initiate a chat with Zeph, saying "tell me a joke".
That's it, really simple.
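
Here's a minimal sketch of those two agents, assuming AutoGen's ConversableAgent API and re-using the config dicts sketched above (the max_turns cap is just an assumption to keep the demo short):

```python
from autogen import ConversableAgent

# Re-using the zephyr_config and phi2_config dicts sketched above.
# Phil runs on Phi-2, Zeph runs on Zephyr; each agent gets its own llm_config.
phil = ConversableAgent(
    name="Phil",
    llm_config=phi2_config,
    human_input_mode="NEVER",  # let the two models talk without pausing for a human
)

zeph = ConversableAgent(
    name="Zeph",
    llm_config=zephyr_config,
    human_input_mode="NEVER",
)

# Phil kicks things off by asking Zeph for a joke; both replies are served by LM Studio.
phil.initiate_chat(zeph, message="Tell me a joke.", max_turns=2)
```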
But this is going to show you how to have multiple models working together in one LM Studio instance. Now, as I said,
we had to get this identifier here.
Let's go over to LM Studio and I'll show you where to get that. So back in LM Studio, if you click, for instance,
let's just click Phi-2 over here, there's an API model identifier.
You can just copy this, and the same thing for Zephyr:
you can just look at the API model identifier and then copy that as well.
All right, after I ran it, as we can see here, these are the server logs for both of the models, and it worked; right here you can see all the tokens for the responses.
But here we go, so it worked, right?
So you can just go back to LM Studio and look, and you can see how the interaction went.
Okay, now if we go back to our IDE
and look at what happened here, we see Phil started talking to Zeph: tell me a joke.
Why did the tomato turn red?
Because it saw the salad dressing.
Ha ha. Great.
Okay.
Awesome.
What happened here?
Okay.
So again, let's review what just happened.
We had two separate models working in one running LM Studio instance.
It was open source.
It was free.
We didn't have to worry about OpenAI's API key, and they could talk to each other.
I think this was a huge update, and I think it's really going to help out.
If you haven't tried it yet, I recommend downloading it and just trying it.
It's free.
You know, they don't store any of your information, and you can use all open-source local LLMs.
If you have any comments or anything you want to chat about, leave them down below.
Thank you for watching and I'll see you next time.
