<!DOCTYPE html>
<html>
  <head>
    <meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
  </head>
  <body>
    <br>
    <blockquote type="cite"
cite="mid:CAEnL8_cdh-tijbZ9xsF5sB-nA1JY6GMHQuat=HO4=T4Re=qdkg@mail.gmail.com">
      <div dir="auto">Probably will at some point.  I entered this
        contest, and I'm waiting to hear back if I'm getting cloud
        access or hardware:
        <div dir="auto"><br>
        </div>
        <div dir="auto"><a
            href="https://www.hackster.io/contests/amd2023#challengeNav"
            rel="noreferrer" target="_blank"
            moz-do-not-send="true" class="moz-txt-link-freetext">https://www.hackster.io/contests/amd2023#challengeNav</a><br>
        </div>
        <div dir="auto"><br>
        </div>
        <div dir="auto">This is my proposal:</div>
        <div dir="auto"><a
href="https://www.hackster.io/contests/amd2023/hardware_applications/16336"
            target="_blank" rel="noreferrer" moz-do-not-send="true"
            class="moz-txt-link-freetext">https://www.hackster.io/contests/amd2023/hardware_applications/16336</a><br>
        </div>
        <div dir="auto"><br>
        </div>
        <div dir="auto">Contest is still open if you want to
          participate, but the hardware requests are finished.</div>
        <div dir="auto"><br>
        </div>
        <div dir="auto">You can download most of the Open Source models
          here:  <a href="https://huggingface.co/models"
            rel="noreferrer" target="_blank"
            moz-do-not-send="true" class="moz-txt-link-freetext">https://huggingface.co/models</a></div>
        <div dir="auto"><br>
        </div>
        <div dir="auto">Good article on how to get started:</div>
        <div dir="auto"><a
href="https://www.philschmid.de/fine-tune-llms-in-2024-with-trl"
            target="_blank" rel="noreferrer" moz-do-not-send="true"
            class="moz-txt-link-freetext">https://www.philschmid.de/fine-tune-llms-in-2024-with-trl</a></div>
        <br>
      </div>
    </blockquote>
    <p>The following heading on this LLM fine-tuning page makes me laugh
      and cry simultaneously, `with(xz)`:</p>
    <p>"""</p>
    <p>    2. Setup development environment</p>
    <p>"""<br>
    </p>
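    <p>For context, that "Setup development environment" step boils down
      to installing the Hugging Face fine-tuning stack. A minimal sketch
      (package list from memory, so check the article for the exact
      versions it pins):</p>

```shell
# Create an isolated environment so the fine-tuning stack doesn't
# clobber system packages (assumed layout; adjust paths to taste).
python -m venv .venv
. .venv/bin/activate

# Core Hugging Face fine-tuning stack used by TRL-based tutorials:
# transformers (models), datasets (data), trl (SFT/DPO trainers),
# peft (LoRA adapters), accelerate (multi-GPU launching).
pip install --upgrade torch transformers datasets trl peft accelerate
```

    <p>Exact versions matter a lot with these libraries, so prefer the
      article's pinned requirements over this sketch.</p>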
    <blockquote type="cite"
cite="mid:CAEnL8_cdh-tijbZ9xsF5sB-nA1JY6GMHQuat=HO4=T4Re=qdkg@mail.gmail.com">
      <div dir="auto">
        <blockquote class="gmail_quote"
style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">I
          am listening to the podcast <br>
          <a
href="https://oxide.computer/podcasts/oxide-and-friends/1692510"
            rel="noreferrer"
            target="_blank" moz-do-not-send="true"
            class="moz-txt-link-freetext">https://oxide.computer/podcasts/oxide-and-friends/1692510</a>
          about Large <br>
          Language Models, and an idea comes up: if any of you have
          tried any <br>
          locally usable LLMs, trained one, tinkered with it, just
          make a <br>
          "let me show" kind of talk. Seriously.<br>
        </blockquote>
      </div>
    </blockquote>
  </body>
</html>