<!DOCTYPE html>
<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
</head>
<body>
<br>
<blockquote type="cite"
cite="mid:CAEnL8_cdh-tijbZ9xsF5sB-nA1JY6GMHQuat=HO4=T4Re=qdkg@mail.gmail.com">
<div dir="auto">Probably will at some point. I entered this
contest, and I'm waiting to hear back if I'm getting cloud
access or hardware:
<div dir="auto"><br>
</div>
<div dir="auto"><a
href="https://www.hackster.io/contests/amd2023#challengeNav"
rel="noreferrer noreferrer" target="_blank"
moz-do-not-send="true" class="moz-txt-link-freetext">https://www.hackster.io/contests/amd2023#challengeNav</a><br>
</div>
<div dir="auto"><br>
</div>
<div dir="auto">This is my proposal:</div>
<div dir="auto"><a
href="https://www.hackster.io/contests/amd2023/hardware_applications/16336"
target="_blank" rel="noreferrer" moz-do-not-send="true"
class="moz-txt-link-freetext">https://www.hackster.io/contests/amd2023/hardware_applications/16336</a><br>
</div>
<div dir="auto"><br>
</div>
<div dir="auto">Contest is still open if you want to
participate, but the hardware requests are finished.</div>
<div dir="auto"><br>
</div>
<div dir="auto">You can download most of the Open Source models
here: <a href="https://huggingface.co/models"
rel="noreferrer noreferrer" target="_blank"
moz-do-not-send="true" class="moz-txt-link-freetext">https://huggingface.co/models</a></div>
<div dir="auto"><br>
</div>
<div dir="auto">Good article on how to get started:</div>
<div dir="auto"><a
href="https://www.philschmid.de/fine-tune-llms-in-2024-with-trl"
target="_blank" rel="noreferrer" moz-do-not-send="true"
class="moz-txt-link-freetext">https://www.philschmid.de/fine-tune-llms-in-2024-with-trl</a></div>
</div>
</blockquote>
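  <p>For anyone who wants to try one of the models from the
    huggingface.co/models link above locally, here is a minimal sketch
    using the transformers library. The model name is just an example I
    picked; any causal LM from the hub that fits in your memory should
    work the same way.</p>
  <pre>
# Minimal sketch: pull a model from the Hugging Face hub and generate
# text locally. Assumes `pip install transformers torch`.
# The model name below is only an example; swap in any model from
# https://huggingface.co/models that fits your RAM/VRAM.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # example model

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("Large language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
</pre>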
  <p>I suggest we apply the same description language to humans. For
    example: here is an 18-year-old chap or gal, and he/she shall be
    fine-tuned to &lt;insert_the_discipline&gt; using this particular
    input data and our helpful tutors ... or data engineers? :) <br>
  </p>
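  <p>And more seriously, the philschmid article above is about
    supervised fine-tuning with Hugging Face's TRL library. As a rough
    idea of what that looks like, here is a minimal sketch along the
    lines of TRL's quickstart; the model and dataset names are just
    placeholders I picked, and the exact argument names shift a bit
    between TRL releases, so follow the article for the details.</p>
  <pre>
# Minimal supervised fine-tuning sketch with TRL's SFTTrainer.
# Assumes `pip install trl transformers datasets` and a recent TRL
# release; the model and dataset below are illustrative placeholders.
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# A small chat-style dataset from the hub; swap in your own data.
dataset = load_dataset("trl-lib/Capybara", split="train")

trainer = SFTTrainer(
    model="TinyLlama/TinyLlama-1.1B-Chat-v1.0",  # any causal LM works
    train_dataset=dataset,
    args=SFTConfig(output_dir="tinyllama-sft"),
)
trainer.train()
</pre>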
<blockquote type="cite"
cite="mid:CAEnL8_cdh-tijbZ9xsF5sB-nA1JY6GMHQuat=HO4=T4Re=qdkg@mail.gmail.com">
<div class="gmail_quote">
<blockquote class="gmail_quote"
style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">I
am listening to podcast <br>
<a
href="https://oxide.computer/podcasts/oxide-and-friends/1692510"
rel="noreferrer noreferrer noreferrer noreferrer"
target="_blank" moz-do-not-send="true"
class="moz-txt-link-freetext">https://oxide.computer/podcasts/oxide-and-friends/1692510</a>
about Large <br>
Language Models, and idea comes up, if anyone of you have
tried any <br>
locally usable LLM's, have trained it, tinkered with it, just
make a <br>
"let me show" kind of talk. Seriously.<br>
</blockquote>
</div>
</blockquote>
</body>
</html>