When you decide to install the uncensored Mixtral model on your computer, you gain access to a remarkable artificial intelligence designed to outperform other models in its class. Known as Mixtral 8x7B, this AI combines eight expert networks of roughly 7 billion parameters each, an architecture that lets it run with impressive speed and efficiency. The tool is not only fast; it also supports multiple languages and can generate code well, making it a top choice for developers and businesses looking for an edge.
Mixtral 8x7B is a high-quality Sparse Mixture of Experts (SMoE) model with open weights, licensed under Apache 2.0. Mixtral outperforms Llama 2 70B on most benchmarks with 6x faster inference. It is the strongest open-weight model with a permissive license and the best model overall when it comes to cost/performance trade-offs. In particular, it matches or outperforms GPT-3.5 on most standard benchmarks.
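The sparse mixture-of-experts idea behind those numbers can be sketched in a few lines of Python: a router scores every expert for each token, only the top two actually run, and their outputs are blended. This is an illustrative toy under assumed names, not Mixtral's actual code; the "experts" here are simple callables standing in for feed-forward networks.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def moe_forward(token, experts, router_scores, k=2):
    """Route a token to the top-k experts and mix their outputs.

    `experts` is a list of callables standing in for the expert
    networks; `router_scores` are the gate logits the router produced
    for this token (one per expert). Names are illustrative only.
    """
    # Pick the k experts with the highest gate logits (k=2 in Mixtral).
    top = sorted(range(len(experts)),
                 key=lambda i: router_scores[i], reverse=True)[:k]
    # Renormalize the gate weights over just the selected experts.
    weights = softmax([router_scores[i] for i in top])
    # Only the selected experts run, which is why a model with a large
    # total parameter count can have the per-token cost of a much
    # smaller one.
    return sum(w * experts[i](token) for w, i in zip(weights, top))

# Toy demo: 8 "experts" that just scale their input.
experts = [lambda x, s=s: s * x for s in range(1, 9)]
scores = [0.1, 2.0, 0.3, 1.5, 0.0, 0.2, 0.4, 0.1]
out = moe_forward(10.0, experts, scores, k=2)
```

Because only two of the eight experts execute per token, inference cost tracks the size of two experts rather than all eight, which is the intuition behind the "6x faster than Llama 2 70B" comparison.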
A version based on Mixtral-8x7B called Dolphin 2.5 Mixtral has been developed on a special dataset that helps it avoid bias and alignment constraints, making it an uncensored variant. This means the AI is not only capable but can be applied across a wide variety of use cases without favoring one group over another. The base model has a 32k context window, and the fine-tuned model uses a 16k context. What is new in Dolphin 2.5 Mixtral, which its creator also describes as "very good" at coding:
- Removed Samantha and WizardLM
- Added Synthia, OpenHermes and PureDove
- Added a new Dolphin-Coder dataset
- Added the MagiCoder dataset
Choosing Mixtral means choosing an AI that delivers top-tier performance. Its sophistication rivals that of much larger models, and its fast response times matter for time-sensitive projects. The AI's ability to handle multiple languages makes it a valuable tool for businesses operating globally. In addition, its code-generation capabilities help automate tasks, boosting productivity and streamlining workflows.
Install Mixtral uncensored locally for added privacy and security.
To learn how to install the uncensored version of Mixtral on your local computer or home network, check out the tutorial created by the World of AI team, which will guide you through the process step by step.
The release of Mixtral's Dolphin 2.5 represents a major advance in AI technology. It provides a neutral platform by addressing issues of bias and alignment, which matters greatly in today's diverse world. Before you begin the installation process, however, it is essential to check that your hardware is up to the task. Having enough RAM is crucial for the AI to run smoothly, and the amount you need will depend on whether you are running the AI for personal use or operating a server.
To help with the installation, LM Studio acts as a wizard that makes it easy to get Mixtral running on your machine. It is designed to be user-friendly, so even people with limited technical knowledge can manage the installation process.
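Once a model is loaded, LM Studio can also expose it through a local OpenAI-compatible HTTP server, which makes the model scriptable. The sketch below uses only the Python standard library and assumes LM Studio's default local-server address (`localhost:1234`) and the OpenAI chat-completions payload convention it mirrors; verify the port and path in your own server tab before relying on them.

```python
import json
import urllib.request

# Assumed default endpoint of LM Studio's local OpenAI-compatible
# server; check the app's "Local Server" tab, as your port may differ.
LM_STUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_request(prompt, temperature=0.7, max_tokens=256):
    """Build an OpenAI-style chat payload for the local server."""
    return {
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
        "max_tokens": max_tokens,
    }

def ask_mixtral(prompt):
    """Send the prompt to the locally running model. Requires the
    LM Studio server to be started first."""
    payload = json.dumps(build_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        LM_STUDIO_URL, data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]

if __name__ == "__main__":
    # Only print the payload here; call ask_mixtral() once the
    # server is actually running.
    print(json.dumps(build_request("Write a haiku about local AI."), indent=2))
```

Keeping the request construction in its own function makes it easy to test and to swap in a different local backend later.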
To get the most out of Mixtral, you can use different quantization methods to improve its performance. These methods can adapt to different setups, from personal computers to large servers, ensuring the AI runs as efficiently as possible.
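A quick way to judge which quantization level your hardware can handle is to estimate the model's memory footprint from its parameter count. The figures below are rough back-of-the-envelope assumptions (a ~46.7B total parameter count for Mixtral 8x7B and ~10% overhead for metadata and mixed-precision layers), not exact sizes for any particular file format:

```python
def quantized_size_gb(n_params, bits_per_weight, overhead=1.10):
    """Rough file/RAM estimate for a quantized model: parameter count
    times bits per weight, plus ~10% overhead. A heuristic, not an
    exact formula for any specific quantization format."""
    bytes_total = n_params * bits_per_weight / 8 * overhead
    return bytes_total / (1024 ** 3)

# Approximate total parameter count of Mixtral 8x7B (assumed).
MIXTRAL_PARAMS = 46.7e9

for bits, name in [(16, "fp16"), (8, "Q8"), (5, "Q5"), (4, "Q4")]:
    size = quantized_size_gb(MIXTRAL_PARAMS, bits)
    print(f"{name:>5}: ~{size:.0f} GB")
```

By this estimate a 4-bit quantization needs on the order of 24 GB, which is why lower-bit quantizations are what make running Mixtral on a well-equipped personal computer feasible, while full fp16 weights are server territory.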
It is also important to keep ethical and legal considerations in mind when using Dolphin 2.5 Mixtral. Given the model's uncensored nature, it is essential to use it responsibly to avoid any negative consequences.
By installing the Mixtral AI model on your local machine, you open up a world of possibilities for your projects. Its exceptional performance, broad language support and coding proficiency make Mixtral a formidable AI tool. Meeting the hardware requirements and using LM Studio for the installation will help you take full advantage of what Mixtral AI has to offer. Remember to always weigh the ethical and legal obligations that come with using an uncensored AI model, so that its use remains responsible and beneficial.
Image credit: World of AI
Read more guides:
- How to fine-tune the Mixtral 8x7B Mistral AI Mixture of Experts (MoE) model
- Testing the amazing performance of the Mixtral 8x7B AI agent
- How to fine-tune the open-source Mixtral AI model
- New Zephyr-7B LLM AI model fine-tunes Mistral-7B and beats Llama-2 70B
- How to install Ollama locally to run Llama 2, Code Llama and other LLM models
- Mistral AI's Mixtral 8x7B mixture-of-experts AI model shows impressive benchmarks