💥 Gate Square Event: #PostToWinTRUST 💥
Post original content on Gate Square related to TRUST or the CandyDrop campaign for a chance to share 13,333 TRUST in rewards!
📅 Event Period: Nov 6, 2025 – Nov 16, 2025, 16:00 (UTC)
📌 Related Campaign:
CandyDrop 👉 https://www.gate.com/announcements/article/47990
📌 How to Participate:
1️⃣ Post original content related to TRUST or the CandyDrop event.
2️⃣ Content must be at least 80 words.
3️⃣ Add the hashtag #PostToWinTRUST
4️⃣ Include a screenshot showing your CandyDrop participation.
🏆 Rewards (Total: 13,333 TRUST)
🥇 1st Prize (1 winner): 3,833
Microsoft Open-Sources New Version of Phi-4: Inference Efficiency Up 10x, Runs on Laptops
Jin10 reported on July 10 that, earlier that morning, Microsoft open-sourced the latest member of the Phi-4 family, Phi-4-mini-flash-reasoning, on its official website. The mini-flash version continues the Phi-4 family’s hallmark of small parameter counts with strong performance. It is designed for scenarios constrained by computing power, memory, and latency, can run on a single GPU, and is suitable for edge devices such as laptops and tablets. Compared with the previous version, mini-flash adopts SambaY, an innovative architecture developed in-house by Microsoft, boosting inference efficiency by 10 times and cutting average latency by 2–3 times, a significant improvement in overall inference performance.