#gemma3

Devin Prater :blind:<p>So I've been using Gemma3:27B a lot lately, and I've noticed some kinda cool things about it. It has patterns! Well, more like templates that it uses to work through common stuff. So, for image descriptions, it'll give an overview, go into a bulleted detailed list, and then summarize. And when giving advice, it'll do this:</p><p>Here's a breakdown of suggestions, categorized by the issues you've identified, with a mix of quick wins and longer-term strategies. I'll also indicate which ones might be easier to implement right away (⭐).</p><p>So yeah, pretty cool for a model that doesn't use reasoning-type tokens and such.</p><p><a href="https://tweesecake.social/tags/AI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>AI</span></a> <a href="https://tweesecake.social/tags/Gemma3" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Gemma3</span></a> <a href="https://tweesecake.social/tags/Gemma" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Gemma</span></a> <a href="https://tweesecake.social/tags/google" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>google</span></a> <a href="https://tweesecake.social/tags/LLM" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLM</span></a></p>
Mike Stone<p>Testing out the newly released <a href="https://fosstodon.org/tags/Gemma3" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Gemma3</span></a> model locally on <a href="https://fosstodon.org/tags/ollama" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ollama</span></a>. This is one of the more frustrating aspects of these LLMs. It must be said that LLMs are fine for what they are, and what they are is a glorified autocomplete. They have their uses (just like autocomplete does), but if you try to use them outside of their strengths your results are going to be less than reliable.</p>
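The local setup described in these posts can be sketched roughly as follows; this is a minimal example, not the posters' exact workflow. It assumes ollama is installed, uses the gemma3:27b tag mentioned in the first post, and the prompt itself is purely illustrative:

```shell
# Sketch of running Gemma 3 locally via the ollama CLI.
# Guarded so the script is a no-op on machines without ollama.
if command -v ollama >/dev/null 2>&1; then
  # Download the 27B Gemma 3 weights (large download on first run)
  ollama pull gemma3:27b
  # One-shot prompt; ollama run also works interactively with no prompt argument
  ollama run gemma3:27b "Give me advice on organizing a cluttered desk."
else
  echo "ollama not installed; skipping"
fi
```

With a prompt like this, the templated "breakdown of suggestions" behavior described above is the kind of output one would be looking for.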