rss.livelink.threads-in-node
https://techcommunity.microsoft.com/t5/
Microsoft Community HubThu, 19 Jun 2025 09:05:48 GMTCommunity2025-06-19T09:05:48ZMicrosoft Sentinel - Alert suppression
https://techcommunity.microsoft.com/t5/microsoft-sentinel/microsoft-sentinel-alert-suppression/m-p/4425433#M12726
<P>Hello Tech Community, </P><P> </P><P>When working with Microsoft Sentinel, we sometimes have to suppress alerts based on information such as UPN, IP, hostname, and other fields. </P><P>Let's imagine we need to suppress 20 combinations of UPN, IP, and hostname. Sometimes the suppression fields should be empty or wildcarded (meaning any value in the log should be suppressed). </P><P>What is the best way to suppress alerts?</P><P> - Automation rules - seem inflexible and work only with entities. </P><P> - Watchlist with "join" or "where" operator - a good option, but doesn't support * (wildcard)</P><P> - Hardcoded in KQL - not flexible, especially when you have SDLC processes</P><P>Please share your ideas and advice.</P>Thu, 19 Jun 2025 08:54:19 GMThttps://techcommunity.microsoft.com/t5/microsoft-sentinel/microsoft-sentinel-alert-suppression/m-p/4425433#M12726mikhailf2025-06-19T08:54:19ZIs there any free program to convert mp4 to wav?
https://techcommunity.microsoft.com/t5/windows-insider-program/is-there-any-free-program-to-convert-mp4-to-wav/m-p/4425429#M35895
<P>Hi,</P><P>I'm relatively new to audio editing and file conversion, and I’m currently working on a project that requires me to extract high-quality audio from an MP4 video file. Specifically, I need the audio in WAV format because of its uncompressed quality, which is important for the next steps in my workflow.</P><P>I did some searching online, but there are so many <STRONG>mp4 to wav</STRONG> converter tools and methods available, and I'm not sure which one is the most reliable and easiest for a beginner like me to use. Could anyone recommend a straightforward way to convert an MP4 file to WAV?</P>Thu, 19 Jun 2025 08:42:41 GMThttps://techcommunity.microsoft.com/t5/windows-insider-program/is-there-any-free-program-to-convert-mp4-to-wav/m-p/4425429#M35895Luocna2025-06-19T08:42:41ZLearning Azure
https://techcommunity.microsoft.com/t5/windows-11/learning-azure/m-p/4425425#M27183
<P>I'd like to learn Azure. Just for fun. But I don't know if I should go with the Microsoft online course or with something else. Any suggestions?</P>Thu, 19 Jun 2025 08:15:20 GMThttps://techcommunity.microsoft.com/t5/windows-11/learning-azure/m-p/4425425#M27183OhioValley2025-06-19T08:15:20ZHow to know about gradually rolling new features on my W11 computer?
https://techcommunity.microsoft.com/t5/windows-11/how-to-know-about-gradually-rolling-new-features-on-my-w11/m-p/4425424#M27182
<P>There are dozens of new W11 features that are communicated to be gradually rolling out. BUT... How can I get *notified* when a new feature (say A, B, or C) has happily landed (been enabled) on MY computer and is available for use? I don't wanna spend time checking whether there is something new or not...</P><P>And no, I'm not talking about following the release / roll-out status from the Microsoft Windows Roadmap web pages.</P><P> </P>Thu, 19 Jun 2025 08:14:39 GMThttps://techcommunity.microsoft.com/t5/windows-11/how-to-know-about-gradually-rolling-new-features-on-my-w11/m-p/4425424#M27182DeclanGray2025-06-19T08:14:39ZHow to Convert exFAT to NTFS on Windows 11 Without Losing Data
https://techcommunity.microsoft.com/t5/windows-11/how-to-convert-exfat-to-ntfs-on-windows-11-without-losing-data/m-p/4425411#M27173
<P>I have an external hard drive currently formatted as exFAT, but I need to switch it to NTFS for better Windows compatibility (file permissions, compression, etc.). I know Windows doesn't have a direct convert command for exFAT like it does for FAT32 to NTFS, so I'm looking for the safest method.</P><P>Here's what I’ve considered:</P><UL><LI>Backup & Reformat – Copy all data elsewhere, reformat the drive as NTFS, then move files back.</LI><LI>Third-party tools – Are there any reliable tools that can <STRONG>convert exfat to ntfs</STRONG> without data loss?</LI><LI>Command-line method – Does diskpart or format have a hidden way to do this?</LI></UL><P>Has anyone successfully done this before? What's the best approach to avoid data loss?</P><P>Thanks in advance!</P>Thu, 19 Jun 2025 07:32:33 GMThttps://techcommunity.microsoft.com/t5/windows-11/how-to-convert-exfat-to-ntfs-on-windows-11-without-losing-data/m-p/4425411#M27173Eamop2025-06-19T07:32:33ZOpenTelemetry in Azure Logic Apps (Standard and Hybrid)
https://techcommunity.microsoft.com/t5/azure-integration-services-blog/opentelemetry-in-azure-logic-apps-standard-and-hybrid/ba-p/4425403
<H4>Why OpenTelemetry?</H4>
<P data-start="340" data-end="590">As modern applications become more distributed and complex, robust observability is no longer optional—it is essential. Organizations need a consistent way to understand how workflows are performing, trace failures, and optimize end-to-end execution.</P>
<P data-start="592" data-end="921"><STRONG data-start="592" data-end="609">OpenTelemetry</STRONG> provides a unified, vendor-agnostic framework for collecting telemetry data—logs, metrics, and traces—across different services and infrastructure layers. It simplifies monitoring and makes it easier to integrate with a variety of observability backends such as Azure Monitor, Grafana Tempo, Jaeger, and others.</P>
<P data-start="923" data-end="1131">For Logic Apps—especially when deployed in <STRONG data-start="966" data-end="976">hybrid</STRONG> or <STRONG data-start="980" data-end="995">on-premises</STRONG> scenarios—OpenTelemetry is a powerful addition that elevates diagnostic capabilities beyond the default Application Insights telemetry.</P>
<H4 data-start="923" data-end="1131">What is OpenTelemetry?</H4>
<P data-start="162" data-end="400">OpenTelemetry (OTel) is an open-source observability framework under the Cloud Native Computing Foundation (CNCF) that provides a unified standard for generating, collecting, and exporting telemetry data such as logs, metrics, and traces.</P>
<P data-start="402" data-end="679">By abstracting away vendor-specific instrumentation and enabling interoperability across various tools and platforms, OpenTelemetry empowers developers and operators to gain deep visibility into distributed systems—regardless of the underlying infrastructure or language stack.</P>
<P data-start="681" data-end="956">In the context of Azure Logic Apps, OpenTelemetry support enables standardized, traceable telemetry that can integrate seamlessly with a wide range of observability solutions. This helps teams monitor, troubleshoot, and optimize workflows with more precision and flexibility.</P>
<H4 data-start="681" data-end="956">How to Configure from Visual Studio Code?</H4>
<P data-start="2066" data-end="2156">To configure OpenTelemetry for a Logic App (Standard) project from <STRONG data-start="2133" data-end="2155">Visual Studio Code</STRONG>:</P>
<OL data-start="2158" data-end="2290">
<LI data-start="2158" data-end="2227">Locate the host.json file in the root of your Logic App project.</LI>
<LI data-start="2228" data-end="2290">Enable OpenTelemetry by adding <STRONG>"telemetryMode": "OpenTelemetry"</STRONG> <SPAN style="color: rgb(30, 30, 30);">at the root level of the file.</SPAN><LI-CODE lang="json">{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle.Workflows",
    "version": "[1.*, 2.0.0)"
  },
  "telemetryMode": "OpenTelemetry"
}</LI-CODE></LI>
<LI data-start="2228" data-end="2290">
<P>Define the following <STRONG data-start="2511" data-end="2535">application settings</STRONG> in local.settings.json or within your CI/CD deployment pipeline:</P>
<UL data-start="2604" data-end="2781">
<LI data-start="2604" data-end="2683">OTEL_EXPORTER_OTLP_ENDPOINT: The OTLP exporter endpoint URL where the telemetry data should be sent.</LI>
<LI data-start="2684" data-end="2781">OTEL_EXPORTER_OTLP_HEADERS <EM data-start="2715" data-end="2727">(optional)</EM>: A list of headers to apply to all outgoing data. This is commonly used to pass authentication keys or tokens to your observability backend.</LI>
</UL>
<P>If your endpoint requires additional OpenTelemetry-related settings, include those in the application settings as well. Refer to the official <A class="lia-external-url" href="https://opentelemetry.io/docs/languages/sdk-configuration/otlp-exporter/" target="_blank" rel="noopener">OTLP Exporter Configuration documentation</A> for details.</P>
</LI>
</OL>
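<P>As a point of reference, a minimal local.settings.json carrying these settings might look like the sketch below. The endpoint URL and token value are placeholders, not real values for any specific backend:</P>
<LI-CODE lang="json">{
  "IsEncrypted": false,
  "Values": {
    "OTEL_EXPORTER_OTLP_ENDPOINT": "https://otel.example.com:4318",
    "OTEL_EXPORTER_OTLP_HEADERS": "Authorization=Bearer your-token-here"
  }
}</LI-CODE>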
<H4>How to Configure OpenTelemetry from Azure Portal? – Standard Logic Apps</H4>
<P>To enable OpenTelemetry support for a <STRONG data-start="247" data-end="269">Standard Logic App</STRONG> hosted using either a <STRONG data-start="292" data-end="318">Workflow Standard Plan</STRONG> or <STRONG data-start="322" data-end="352">App Service Environment v3</STRONG>, follow the steps below:</P>
<H5 data-start="379" data-end="417"><STRONG data-start="383" data-end="417">1. Update the host.json File</STRONG></H5>
<OL data-start="419" data-end="1216">
<LI data-start="419" data-end="520">In the <A href="https://portal.azure.com" target="_blank" rel="noopener" data-start="429" data-end="469">Azure portal</A>, navigate to your <STRONG data-start="488" data-end="510">Standard Logic App</STRONG> resource.</LI>
<LI data-start="521" data-end="636">In the left-hand menu, under <STRONG data-start="553" data-end="574">Development Tools</STRONG>, select <STRONG data-start="583" data-end="606">Advanced Tools > Go</STRONG>. This opens the Kudu console.</LI>
<LI data-start="637" data-end="736">In Kudu, from the <STRONG data-start="658" data-end="675">Debug Console</STRONG> menu, select <STRONG data-start="689" data-end="696">CMD</STRONG>, and navigate to:<BR data-start="714" data-end="717" />site > wwwroot</LI>
<LI data-start="737" data-end="794">Locate and open the host.json file in a text editor.</LI>
<LI data-start="795" data-end="1176">Add the following configuration at the root level of the file to enable OpenTelemetry, then save and close the editor.<LI-CODE lang="json">{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle.Workflows",
    "version": "[1.*, 2.0.0)"
  },
  "telemetryMode": "OpenTelemetry"
}</LI-CODE></LI>
</OL>
<H5 data-start="1218" data-end="1272"><STRONG data-start="1222" data-end="1272">2. Configure App Settings for Telemetry Export</STRONG></H5>
<OL data-start="1274" data-end="1993">
<LI data-start="1274" data-end="1386">Still within your Logic App resource, go to <STRONG data-start="1321" data-end="1357">Settings > Environment Variables</STRONG> and select <STRONG data-start="1369" data-end="1385">App settings</STRONG>.</LI>
<LI data-start="1387" data-end="1945">Add the following key-value pairs:
<DIV class="styles_lia-table-wrapper__h6Xo9 styles_table-responsive__MW0lN"><table border="1" style="border-width: 1px;"><thead><tr><th>App Setting</th><th>Description</th></tr></thead><tbody><tr><td>OTEL_EXPORTER_OTLP_ENDPOINT</td><td>The OTLP (OpenTelemetry Protocol) endpoint URL where telemetry data will be exported. For example: https://otel.your-observability-platform.com</td></tr><tr><td>OTEL_EXPORTER_OTLP_HEADERS <EM>(Optional)</EM></td><td>Any custom headers required by your telemetry backend, such as an Authorization token (e.g., Authorization=Bearer &lt;key&gt;).</td></tr></tbody></table></DIV>
<P> </P>
</LI>
<LI data-start="1387" data-end="1945">Select <STRONG data-start="1957" data-end="1966">Apply</STRONG> to save the configuration.</LI>
</OL>
<H4>How to Configure OpenTelemetry from Azure Portal? – Hybrid Logic Apps</H4>
<P>To enable OpenTelemetry support for a <STRONG data-start="259" data-end="313">Standard Logic App using the Hybrid hosting option</STRONG>, follow the steps below. This configuration enables telemetry collection and export from an on-premises deployment, using environment variables and local file system access.</P>
<H5><STRONG>1. Modify host.json on the SMB Share</STRONG></H5>
<OL>
<LI data-start="537" data-end="635">On your on-premises file share (SMB), navigate to the root directory of your Logic App project.</LI>
<LI data-start="636" data-end="667">Locate the host.json file.</LI>
<LI data-start="668" data-end="1013">Add the following configuration to enable OpenTelemetry and save the file.<LI-CODE lang="json">{
  "version": "2.0",
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle.Workflows",
    "version": "[1.*, 2.0.0)"
  },
  "telemetryMode": "OpenTelemetry"
}</LI-CODE></LI>
</OL>
<H5 data-start="1057" data-end="1115"><STRONG data-start="1061" data-end="1115">2. Configure Environment Variables in Azure Portal</STRONG></H5>
<OL data-start="1117" data-end="2274">
<LI data-start="1117" data-end="1233">Go to the <A href="https://portal.azure.com" target="_blank" rel="noopener" data-start="1130" data-end="1170">Azure Portal</A> and navigate to your <STRONG data-start="1192" data-end="1223">Standard Logic App (Hybrid)</STRONG> resource.</LI>
<LI data-start="1234" data-end="1330">From the left-hand menu, select <STRONG data-start="1269" data-end="1294">Settings > Containers</STRONG>, then click on <STRONG data-start="1310" data-end="1329">Edit and deploy</STRONG>.</LI>
<LI data-start="1331" data-end="2212">In the <STRONG data-start="1341" data-end="1361">Edit a container</STRONG> pane, select <STRONG data-start="1375" data-end="1400">Environment variables</STRONG>, and then click <STRONG data-start="1417" data-end="1424">Add</STRONG> to define the following:
<DIV class="styles_lia-table-wrapper__h6Xo9 styles_table-responsive__MW0lN"><table class="lia-border-style-solid" border="1" style="border-width: 1px;"><thead><tr><th>Name</th><th>Source</th><th>Value</th><th>Description</th></tr></thead><tbody><tr><td>OTEL_EXPORTER_OTLP_ENDPOINT</td><td>Manual</td><td>&lt;OTLP-endpoint-URL&gt;</td><td>The OTLP exporter endpoint URL where telemetry should be sent. Example: https://otel.yourbackend.com</td></tr><tr><td>OTEL_EXPORTER_OTLP_HEADERS <EM>(Optional)</EM></td><td>Manual</td><td>&lt;OTLP-headers&gt;</td><td>Custom headers (e.g., Authorization=Bearer &lt;token&gt;) required by your observability backend.</td></tr></tbody></table></DIV>
<P> </P>
</LI>
<LI data-start="1331" data-end="2212">Once you've added all necessary settings, click <STRONG data-start="2265" data-end="2273">Save</STRONG>.</LI>
</OL>
<H4>Example of Endpoint Configuration & How to Check Logs</H4>
<P>To export telemetry data using OpenTelemetry, configure the following environment variables in your Logic App’s application settings or container environment:</P>
<DIV class="styles_lia-table-wrapper__h6Xo9 styles_table-responsive__MW0lN"><table class="lia-border-style-solid" border="1" style="border-width: 1px;"><thead><tr><th>Name</th><th>Source</th><th>Value</th><th>Description</th></tr></thead><tbody><tr><td>OTEL_EXPORTER_OTLP_ENDPOINT</td><td>Manual Entry</td><td>https://otel.kloudmate.com:4318</td><td>The OTLP receiver endpoint for your observability backend.</td></tr><tr><td>OTEL_EXPORTER_OTLP_HEADERS</td><td>Manual Entry</td><td>Authorization=&lt;your-api-key&gt;</td><td>Used to authenticate requests to the telemetry backend.</td></tr><tr><td>OTEL_EXPORTER_OTLP_PROTOCOL</td><td>Manual Entry</td><td>http/protobuf</td><td>Protocol used for exporting telemetry (KloudMate supports gRPC/HTTP).</td></tr></tbody></table></DIV>
<P>In this example, we are using <STRONG data-start="1376" data-end="1389">KloudMate</STRONG> as the destination for telemetry data. Once correctly configured, your Logic App will begin exporting telemetry data to KloudMate.</P>
<H4>Limitations and Troubleshooting Steps</H4>
<H5><STRONG data-start="4727" data-end="4750">Current Limitations</STRONG></H5>
<UL data-start="4752" data-end="5105">
<LI data-start="4752" data-end="4853">Supported trigger types for OpenTelemetry in Logic Apps are:
<UL data-start="4817" data-end="4853">
<LI data-start="4817" data-end="4823">HTTP</LI>
<LI data-start="4826" data-end="4839">Service Bus</LI>
<LI data-start="4842" data-end="4853">Event Hub</LI>
</UL>
</LI>
<LI data-start="4854" data-end="4905">Exporting <STRONG data-start="4866" data-end="4877">metrics</STRONG> is not currently supported.</LI>
</UL>
<H5 data-start="5107" data-end="5136"><STRONG data-start="5111" data-end="5136">Troubleshooting Steps</STRONG></H5>
<UL data-start="5138" data-end="5683">
<LI data-start="5138" data-end="5300"><STRONG data-start="5140" data-end="5162">No traces received</STRONG>:
<UL data-start="5166" data-end="5300">
<LI data-start="5166" data-end="5233">Validate OTEL_EXPORTER_OTLP_ENDPOINT URL and port availability.</LI>
<LI data-start="5236" data-end="5300">Ensure outbound traffic to observability backend is permitted.</LI>
</UL>
</LI>
<LI data-start="5301" data-end="5399"><STRONG data-start="5303" data-end="5328">Authentication issues</STRONG>:
<UL data-start="5332" data-end="5399">
<LI data-start="5332" data-end="5399">Review and correct header values in OTEL_EXPORTER_OTLP_HEADERS.</LI>
</UL>
</LI>
</UL>
<H4>References</H4>
<P><SPAN data-teams="true"><A class="lia-external-url" href="https://review.learn.microsoft.com/en-us/azure/logic-apps/enable-enhanced-telemetry-standard-workflows?branch=main&tabs=portal#set-up-opentelemetry-for-performance-monitoring" aria-label="Link Set up and view enhanced telemetry for Standard workflows - Azure Logic Apps | Microsoft Learn" target="_blank">Set up and view enhanced telemetry for Standard workflows - Azure Logic Apps | Microsoft Learn</A></SPAN></P>Thu, 19 Jun 2025 07:30:52 GMThttps://techcommunity.microsoft.com/t5/azure-integration-services-blog/opentelemetry-in-azure-logic-apps-standard-and-hybrid/ba-p/4425403harshulmidha2025-06-19T07:30:52ZPossible bug in WIN 24H2
https://techcommunity.microsoft.com/t5/windows-11/possible-bug-in-win-24h2/m-p/4425409#M27171
<P>If I have three File Explorer windows open, a program maximized, and I want to select a File Explorer window: when I hover the mouse over the This PC shortcut on my Taskbar, it shows the three File Explorer windows. If I move the mouse over one of them and click on it, it selects that window, which now sits on top of the maximized program, so I can view the File Explorer window and drag a file into the maximized program.</P>Thu, 19 Jun 2025 07:23:34 GMThttps://techcommunity.microsoft.com/t5/windows-11/possible-bug-in-win-24h2/m-p/4425409#M27171XanderHawkhill2025-06-19T07:23:34ZWin 11 24H2 can not reboot... but can boot?
https://techcommunity.microsoft.com/t5/windows-11/win-11-24h2-can-not-reboot-but-can-boot/m-p/4425405#M27169
<P>If I use Windows Button -> Power -> Reboot, Windows 11 reboots and afterwards (not at the "Restarting" screen, but the boot logo) hangs on the rotating loading circle indefinitely.</P><P>If I shut down and start up again, it works perfectly fine. If I hold the power button during the infinite loading circle, it shuts down and then boots up perfectly fine, too.</P><P>Event Viewer has an entry about not shutting down cleanly after force-shutdown by power button, but no error messages about what caused it to hang.<BR /><BR />Any ideas where to look? This used to work normally until about 2 weeks ago, then suddenly showed this behavior.<BR /><BR />Both sfc /scannow and dism /online /cleanup-image /restorehealth (and /scanhealth) show no issues at all. Neither does scanning the system drive.</P><P> </P>Thu, 19 Jun 2025 07:22:28 GMThttps://techcommunity.microsoft.com/t5/windows-11/win-11-24h2-can-not-reboot-but-can-boot/m-p/4425405#M27169Luccask2025-06-19T07:22:28ZQuest 5 - I want to add conversation memory to my app
https://techcommunity.microsoft.com/t5/microsoft-developer-community/quest-5-i-want-to-add-conversation-memory-to-my-app/ba-p/4425013
<P>In this quest, you’ll explore how to build GenAI apps using a modern JavaScript AI framework, LangChain.js. LangChain.js helps you orchestrate prompts, manage memory, and build multi-step AI workflows, all while staying in your favorite language.</P>
<P> </P>
<P>Using LangChain.js, you will make your GenAI chat app feel truly personal by teaching it to remember. In this quest, you’ll upgrade your AI prototype with conversation memory, allowing it to recall previous interactions, making the conversation flow more naturally and feel more human-like.</P>
<P> </P>
<P>👉 Want to catch up on the full program or grab more quests? <A href="https://aka.ms/JSAIBuildathon" target="_blank">https://aka.ms/JSAIBuildathon</A></P>
<P>💬 Got questions or want to hang with other builders? Join us on <A href="https://aka.ms/JSAIonDiscord" target="_blank">Discord</A> — head to the #js-ai-build-a-thon channel.</P>
<H2>🔧 What You’ll Build</H2>
<P>A smarter, context-aware chat backend that:</P>
<UL>
<LI>Remembers user conversations across multiple exchanges (e.g., knowing "Terry" after you introduced yourself as Terry) </LI>
<LI>Maintains session-specific memory so each chat thread feels consistent and coherent </LI>
<LI>Uses LangChain.js abstractions to streamline state management.</LI>
</UL>
<H2>🚀 What You’ll Need</H2>
<OL>
<LI>✅ A GitHub account</LI>
<LI>✅ <A href="https://code.visualstudio.com/" target="_blank">Visual Studio Code</A></LI>
<LI>✅ <A href="https://nodejs.org/en" target="_blank">Node.js</A></LI>
<LI>✅ A working chat app from previous quests (UI + Azure-based chat endpoint)</LI>
</OL>
<H2>🛠️ Concepts You’ll Explore</H2>
<P> </P>
<H3>Integrating LangChain.js</H3>
<P>Learn how LangChain.js simplifies building AI-powered web applications by providing a standard interface to connect your backend with Azure’s language models. You’ll see how using this framework decouples your code and unlocks advanced features.</P>
<P> </P>
<H3>Adding Conversation Memory</H3>
<P>Understand why memory matters in chatbots. Explore how conversation memory lets your app remember previous user messages within each session, enabling more context-aware and coherent conversations.</P>
<P> </P>
<H3>Session-based Message History</H3>
<P>Implement session-specific chat histories using LangChain’s memory modules (ChatMessageHistory and BufferMemory). Each user or session gets its own history, so previous questions and answers inform future responses without manual log management.</P>
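<P>To illustrate the core idea independently of the framework, here is a minimal, framework-agnostic sketch of session-scoped message history in plain JavaScript. The function and store names are hypothetical, not LangChain.js APIs; LangChain’s memory modules handle this bookkeeping for you:</P>

```javascript
// A tiny in-memory store: each session ID maps to its own message array.
// (Hypothetical names for illustration -- not LangChain.js APIs.)
const sessionHistories = new Map();

// Retrieve (or lazily create) the history for a given session.
function getHistory(sessionId) {
  if (!sessionHistories.has(sessionId)) {
    sessionHistories.set(sessionId, []);
  }
  return sessionHistories.get(sessionId);
}

// Record a message and return the full context to send with the next prompt.
function addMessage(sessionId, role, content) {
  const history = getHistory(sessionId);
  history.push({ role, content });
  return history;
}

// Two sessions keep independent histories.
addMessage("session-a", "user", "Hi, I'm Terry.");
addMessage("session-a", "assistant", "Nice to meet you, Terry!");
addMessage("session-b", "user", "What's the weather?");

console.log(getHistory("session-a").length); // 2
console.log(getHistory("session-b").length); // 1
```

<P>Because each session ID keys its own array, "session-a" remembers the name Terry while "session-b" stays unaffected, which is exactly the per-session isolation the memory modules provide.</P>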
<H3>Seamless State Management</H3>
<P>Experience how LangChain handles chat logs and memory behind the scenes, freeing you from manually stitching together chat history or juggling context with every prompt.</P>
<H2>📖 Bonus Resources to Go Deeper</H2>
<UL>
<LI><A href="https://techcommunity.microsoft.com/blog/educatordeveloperblog/langchain-js--azure-a-generative-ai-app-journey/4101258" data-lia-auto-title-active="0" data-lia-auto-title="Exploring Generative AI in App Development: LangChain.js and Azure" target="_blank">Exploring Generative AI in App Development: LangChain.js and Azure</A>: a video introduction to LangChain.js and how you can build a project with LangChain.js and Azure</LI>
<LI><A href="https://js.langchain.com/" data-lia-auto-title-active="0" data-lia-auto-title="🦜️🔗 Langchain" target="_blank">🦜️🔗 Langchain</A>: the official LangChain.js documentation.</LI>
<LI><A href="https://github.com/Azure-Samples/serverless-chat-langchainjs" data-lia-auto-title-active="0" data-lia-auto-title="GitHub - Azure-Samples/serverless-chat-langchainjs: Build your own serverless AI Chat with Retrieval-Augmented-Generation using LangChain.js, TypeScript and Azure" target="_blank">GitHub - Azure-Samples/serverless-chat-langchainjs: Build your own serverless AI Chat with Retrieval-Augmented-Generation using LangChain.js, TypeScript and Azure</A>: A GitHub sample that helps you build your own serverless AI Chat with Retrieval-Augmented-Generation using LangChain.js, TypeScript and Azure</LI>
<LI><A href="https://github.com/Azure-Samples/langchainjs-quickstart-demo" data-lia-auto-title-active="0" data-lia-auto-title="GitHub - Azure-Samples/langchainjs-quickstart-demo: Build a generative AI application using LangChain.js, from local to Azure" target="_blank">GitHub - Azure-Samples/langchainjs-quickstart-demo: Build a generative AI application using LangChain.js, from local to Azure</A>: A GitHub sample that helps you build a generative AI application using LangChain.js, from local to Azure.</LI>
<LI><A class="lia-external-url" href="https://js.langchain.com/docs/integrations/platforms/microsoft/" data-lia-auto-title-active="0" data-lia-auto-title="Microsoft | 🦜️🔗 Langchain" target="_blank">Microsoft | 🦜️🔗 Langchain</A> Official LangChain documentation on all functionalities related to Microsoft and Microsoft Azure.</LI>
<LI><A class="lia-internal-link lia-internal-url lia-internal-url-content-type-blog" href="https://techcommunity.microsoft.com/blog/azuredevcommunityblog/quest-4---i-want-to-connect-my-ai-prototype-to-external-data-using-rag/4424171" data-lia-auto-title="Quest 4 - I want to connect my AI prototype to external data using RAG | Microsoft Community Hub" data-lia-auto-title-active="0" target="_blank">Quest 4 - I want to connect my AI prototype to external data using RAG | Microsoft Community Hub</A> a link to the previous quest instructions.</LI>
</UL>Thu, 19 Jun 2025 07:00:00 GMThttps://techcommunity.microsoft.com/t5/microsoft-developer-community/quest-5-i-want-to-add-conversation-memory-to-my-app/ba-p/4425013bethanyjep2025-06-19T07:00:00ZEngaging Employees: A Journey Through Data Analytics
https://techcommunity.microsoft.com/t5/educator-developer-blog/engaging-employees-a-journey-through-data-analytics/ba-p/4421284
<DIV class="lia-embeded-content" contenteditable="false"><IFRAME src="https://www.youtube.com/embed/anaaXDsRCCY?si=LTC2JGO8n6bYOgkg" width="560" height="315" title="YouTube video player" allowfullscreen="allowfullscreen" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" frameborder="0" sandbox="allow-scripts allow-same-origin allow-forms"></IFRAME></DIV>
<H6><STRONG>Our Team </STRONG>(<EM>Sorted Alphabetically</EM>)<STRONG>:</STRONG></H6>
<DIV class="styles_lia-table-wrapper__h6Xo9 styles_table-responsive__MW0lN"><table border="1" style="width: 100%; height: 92.4px; border-width: 1px;"><colgroup><col style="width: 49.3219%" /><col style="width: 4.2706%" /><col style="width: 46.3781%" /></colgroup><tbody><tr style="height: 30.8px;"><td class="lia-align-right" style="height: 30.8px;"><A class="lia-external-url" href="https://www.linkedin.com/in/ashkan-allahyari/" target="_blank" rel="noopener">Ashkan Allahyari</A></td><td class="lia-align-center">|</td><td style="height: 30.8px;">
<P><A href="mailto:[email protected]" target="_blank" rel="noopener">[email protected]</A></P>
</td></tr><tr style="height: 30.8px;"><td class="lia-align-right" style="height: 30.8px;"><A class="lia-external-url" href="https://www.linkedin.com/in/ole-bekker/" target="_blank" rel="noopener">Ole Bekker</A></td><td class="lia-align-center">|</td><td style="height: 30.8px;">
<P><A href="mailto:[email protected]" target="_blank" rel="noopener">[email protected]</A></P>
</td></tr><tr style="height: 30.8px;"><td class="lia-align-right" style="height: 30.8px;"><A class="lia-external-url" href="https://www.linkedin.com/in/robin-elster-5a703122b/" target="_blank" rel="noopener">Robin Elster</A></td><td class="lia-align-center">|</td><td style="height: 30.8px;">
<P><A href="mailto:[email protected]" target="_blank" rel="noopener">[email protected]</A></P>
</td></tr><tr><td class="lia-align-right"><A class="lia-external-url" href="https://www.linkedin.com/in/waad-hegazy-310712167/" target="_blank" rel="noopener">Waad Hegazy</A></td><td class="lia-align-center">|</td><td>
<P><A href="mailto:[email protected]" target="_blank" rel="noopener">[email protected]</A></P>
</td></tr><tr><td class="lia-align-right"><A class="lia-external-url" href="https://www.linkedin.com/in/lea-hierl/" target="_blank" rel="noopener">Lea Hierl</A></td><td class="lia-align-center">|</td><td>
<P><A href="mailto:[email protected]" target="_blank" rel="noopener">[email protected]</A></P>
</td></tr><tr><td class="lia-align-right">Linda Pham</td><td class="lia-align-center">|</td><td>
<P><A href="mailto:[email protected]" target="_blank" rel="noopener">[email protected]</A></P>
</td></tr></tbody></table></DIV>
<P class="lia-align-center"> Master Business Analysis & Modelling, Radboud University</P>
<P class="lia-align-center">Master Strategic Human Resources Leadership, Radboud University</P>
<P class="lia-align-center">Student Exchange at Radboud University</P>
<H5><STRONG>Project Overview</STRONG></H5>
<P>At Radboud University, our team in the course Data-Driven Analytics for Responsible Business Solutions embraced a unique opportunity to apply our passion for data analytics to a real-world challenge. Tasked with analyzing employee turnover at VenturaGear—a company committed to fostering a thriving workplace—we conducted an in-depth study to uncover the root causes of attrition. By leveraging rigorous data analytics, machine learning techniques, and Power BI visualizations, we identified key drivers of employee satisfaction and retention. This blog outlines our approach, findings, and actionable strategies to address similar challenges.</P>
<H5><STRONG>Understanding the Turnover Challenge</STRONG></H5>
<P>VenturaGear, a company with diverse operational units, faced rising employee turnover, prompting our data analytics team to identify its root causes. Initial concerns highlighted external competition and internal workplace factors, but the broad question of “why are employees leaving?” required refinement for actionable insights. To tackle this, we first reviewed the dataset and its accompanying data manual to gain a comprehensive understanding of the available data. We then conducted exploratory data analysis on current employee records, focusing on key variables: organizational hierarchy (e.g., employee level), tenure, business units (e.g., inventory, manufacturing), pay frequency, pay rate, and performance metrics.</P>
<P>This approach led us to refine our research question: How do organizational hierarchy, tenure, business unit, pay structure, and performance metrics influence employee satisfaction and turnover at VenturaGear? By narrowing the scope, we could systematically explore these factors, uncovering patterns and drivers of attrition to set the stage for targeted recommendations.</P>
<H5><STRONG>Data Preparation</STRONG></H5>
<P>To answer our refined research question effectively, we needed to examine VenturaGear’s turnover challenge from multiple perspectives. Using Power BI, we created a comprehensive data model to avoid tunnel vision and ensure a holistic analysis. This began with a thorough review of the dataset to assess data quality, checking for inconsistencies, data type accuracy, and missing values (Hair et al., 2018). For instance, we converted the "EndDate" field in the HR_EmployeeDepartmentHistory table from text to date format and corrected errors; for example, one record in which an employee’s "EndDate" preceded their "StartDate" was removed to maintain data integrity. We streamlined the analysis by importing only relevant tables from the Human Resources dataset, establishing relationships using the data dictionary and verifying cardinality to prevent errors (Arnold, 2022). </P>
<P>To enhance our analysis, we developed several new columns and measures, including WorkingLifeTime, calculated with DATEDIFF to assess employee tenure, and ActiveCountTrue and ActiveRatioTrue to evaluate active employee statuses. We also categorized organizational levels with OrgLevelLabel to reflect VenturaGear’s hierarchy. To ensure accurate pay analysis, we implemented the Interquartile Range (IQR) method for rate outlier detection. While we flagged missing data in fields like BirthDate and HireDate, we left these unchanged to avoid making assumptions. This data preparation allowed us to conduct a thorough analysis of turnover drivers across various levels, departments, and pay structures.</P>
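<P>The IQR rule we used for rate outlier detection can be sketched in a few lines. The following is a generic JavaScript illustration of the method with made-up pay rates, not the DAX we implemented in Power BI:</P>

```javascript
// Quantile by linear interpolation between closest ranks (input must be sorted).
function quantile(sorted, q) {
  const pos = (sorted.length - 1) * q;
  const base = Math.floor(pos);
  const rest = pos - base;
  if (base + 1 < sorted.length) {
    return sorted[base] + rest * (sorted[base + 1] - sorted[base]);
  }
  return sorted[base];
}

// Flag values outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR] as outliers.
function iqrOutliers(values) {
  const sorted = [...values].sort((a, b) => a - b);
  const q1 = quantile(sorted, 0.25);
  const q3 = quantile(sorted, 0.75);
  const iqr = q3 - q1;
  const lower = q1 - 1.5 * iqr;
  const upper = q3 + 1.5 * iqr;
  return values.filter((v) => v < lower || v > upper);
}

// Example: hypothetical hourly pay rates with one extreme value.
const rates = [12, 14, 15, 15, 16, 17, 18, 19, 20, 95];
console.log(iqrOutliers(rates)); // [ 95 ]
```

<P>The same fences (Q1 minus 1.5 IQR, Q3 plus 1.5 IQR) are what a DAX or Power Query implementation would compute before flagging a rate as an outlier.</P>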
<H5><STRONG>Our Journey in Analysis</STRONG></H5>
<P>With a robust data model in place, our journey in analysis focused on translating VenturaGear’s employee data into actionable insights. We began by analyzing turnover within different units to understand the issue, then segmented employees by tenure, units, and levels using the MECE framework for a deeper understanding (Rasiel, 1999). Next, we measured well-being through job satisfaction, pay frequency, and rate, identifying markers that hinted at extrinsic motivation. To quantify these influences, we developed a machine learning model using Microsoft Fabric AutoML, treating satisfaction as a continuous variable (Müller & Guido, 2016). The model’s feature importance analysis revealed that pay frequency (0.55) and organizational level (0.26) were the strongest predictors, while pay rate and shift ID had minimal impact.</P>
<img />
<P>However, the trained regression machine-learning model’s prediction performance (R² of 20%, MAE of 3.63, MAPE of 5.75%) posed a challenge, indicating that job satisfaction was influenced by complex factors not fully captured by our variables. This led us to pivot toward descriptive analytics, leveraging Power BI to visualize trends and patterns. We designed two key dashboards: the “Turnover Overview” dashboard, and the “Job Satisfaction and Pay Rate” dashboard, to guide management through key insights.</P>
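<P>For readers unfamiliar with these metrics, R², MAE, and MAPE can be computed directly from actuals and predictions. A minimal pure-Python sketch (the formulas are standard; any numbers used with it below are toy values, not our data):</P>

```python
def mae(actual, predicted):
    """Mean Absolute Error: average size of the prediction miss."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def mape(actual, predicted):
    """Mean Absolute Percentage Error, in percent (actuals must be non-zero)."""
    return 100 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

def r2(actual, predicted):
    """Coefficient of determination: share of variance the model explains."""
    mean_a = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean_a) ** 2 for a in actual)
    return 1 - ss_res / ss_tot
```

<P>An R² of 0.20 means the model explains only about a fifth of the variance in satisfaction scores, which is why descriptive analytics became the more productive path.</P>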
<DIV class="styles_lia-table-wrapper__h6Xo9 styles_table-responsive__MW0lN"><table border="1" style="width: 100%; border-width: 1px;"><tbody><tr><td><img />
<P> </P>
</td><td><img />
<P> </P>
</td></tr></tbody></table></DIV>
<P>A critical lesson emerged during visualization design: aspect ratios can distort perceptions of trends. Wider graphs presented balanced trends, while narrower ones exaggerated steepness, risking misinterpretation (Peltier et al., 2021). We carefully adjusted our dashboards to balance clarity and accuracy, ensuring axes were scaled appropriately for each salary group to avoid misleading comparisons (Fisher et al., 2021). This analytical journey—from modeling to visualization—enabled us to uncover nuanced insights into turnover and satisfaction, setting the foundation for actionable recommendations.</P>
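<P>The distortion is easy to quantify: the angle a trend line makes on screen depends on the pixel dimensions of the axes, not only on the data. A small sketch (the pixel sizes and data ranges are illustrative assumptions, not measurements from our dashboards):</P>

```python
import math

def rendered_angle_deg(dx_data, dy_data, width_px, height_px, x_range, y_range):
    """On-screen angle (degrees) of a trend with data slope dy_data/dx_data
    when drawn in an axes box of width_px x height_px pixels."""
    dx_px = dx_data / x_range * width_px
    dy_px = dy_data / y_range * height_px
    return math.degrees(math.atan2(dy_px, dx_px))

# Same trend: satisfaction falls 2 points over 10 years, y-axis spanning 5 points
wide = rendered_angle_deg(10, 2, width_px=800, height_px=300, x_range=10, y_range=5)
narrow = rendered_angle_deg(10, 2, width_px=300, height_px=300, x_range=10, y_range=5)
```

<P>With these numbers, the identical trend renders at roughly 8.5° in the wide chart but about 21.8° in the narrow one, appearing well over twice as steep.</P>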
<img />
<H5><STRONG>Key Findings and Insights</STRONG></H5>
<P>Our analytical journey revealed critical insights into VenturaGear’s turnover challenges, combining descriptive analytics with Power BI visualizations. The regression model, despite its limitations, guided our exploration, while carefully designed dashboards provided clear, actionable trends.</P>
<img />
<OL>
<LI aria-level="1">Turnover Patterns: Only 11% of employees with recorded “EndDates” left VenturaGear; many changed departments, suggesting lower true turnover than initially assumed. The inventory department exhibited the highest relative turnover, while manufacturing showed the lowest. Departures often peaked in the second year, pointing to a mismatch between onboarding expectations and job reality (Lievens et al., 2001).</LI>
<LI aria-level="1">Motivation Dynamics: Lower-level employees rely heavily on extrinsic motivators, such as bi-weekly pay, which boosts short-term satisfaction but fails to foster long-term commitment (Ryan & Deci, 2000; Bénabou & Tirole, 2003).</LI>
<LI aria-level="1">Performance Metrics: Keyboard presses per hour, the only universal performance metric, varies by role and shows no clear link to satisfaction. This metric may create a sense of monitoring, undermining employee autonomy and trust (Ryan & Deci, 2000).</LI>
<LI aria-level="1">Departmental Challenges: The inventory department’s high turnover and low satisfaction highlight unit-specific issues requiring targeted interventions.</LI>
</OL>
<img />
<P>These findings emphasize the need for VenturaGear to balance extrinsic and intrinsic motivators to enhance employee engagement and reduce turnover. For a deeper dive into our visualizations, explore our Power BI Workspace.</P>
<H5><STRONG>Actionable Recommendations</STRONG></H5>
<P>Our findings highlight the need for VenturaGear to address both extrinsic and intrinsic motivators to reduce turnover and enhance engagement. Based on our analysis, we propose the following strategies:</P>
<H6><STRONG>Short-Term Actions</STRONG></H6>
<UL>
<LI aria-level="1">Competitive Wages: Conduct a market analysis to ensure pay rates align with industry standards, as pay frequency and rate are key extrinsic motivators (Ryan & Deci, 2000).</LI>
<LI aria-level="1">Increase Pay Frequency: Shift to bi-weekly pay for all employees, if financially feasible, to strengthen the link between work and rewards.</LI>
</UL>
<img />
<H6><STRONG>Long-Term Strategies</STRONG></H6>
<UL>
<LI aria-level="1">Foster Intrinsic Motivation: Provide clear career growth pathways to enhance competence, implement</LI>
</UL>
<H5><STRONG>Conclusion</STRONG></H5>
<P>Our analysis of VenturaGear’s employee turnover highlights the value of data-driven insights for complex workplace issues. By utilizing systematic data preparation and impactful Power BI visualizations, we identified key attrition drivers, including pay frequency and challenges within the inventory department. While our regression model pinpointed important factors, we found that intrinsic motivation plays a crucial role in long-term employee commitment. This process provided VenturaGear with actionable insights and demonstrated the importance of blending academic rigor with practical solutions, paving the way for targeted strategies to foster a more engaged workforce.</P>
<H5><STRONG>References</STRONG></H5>
<P>Arnold, J. (2022). Learning Microsoft Power BI. O'Reilly Media, Inc.</P>
<P>Bénabou, R., & Tirole, J. (2003). Intrinsic and extrinsic motivation. The Review of Economic Studies, 70(3), 489-520.</P>
<P>Fisher, J., Chang, R., & Wu, E. (2021). Automatic Y-axis Rescaling in Dynamic</P>
<P>Hair, J. F., Black, W. C., Babin, B. J., & Anderson, R. E. (2018). Multivariate Data Analysis (8th ed.). Cengage Learning. Print ISBN: 9781473756540</P>
<P>Lievens, F., Decaesteker, C., Coetsier, P., & Geirnaert, J. (2001). Organizational attractiveness for prospective applicants: A person–organisation fit perspective. Applied Psychology, 50(1), 30-51.</P>
<P>Müller, A. C., & Guido, S. (2016). Introduction to machine learning with Python: A guide for data scientists. O'Reilly Media, Inc.</P>
<P>Peltier, C., McKenna, J., Sinclair, T., Garwood, J., & Vannest, K. (2021). Brief Report: Ordinate Scaling and Axis Proportions of Single-Case Graphs in Two Prominent EBD Journals From 2010 to 2019. Behavioral Disorders, 47, 134-148. <A href="https://doi.org/10.1177/0198742920982587" target="_blank" rel="noopener">https://doi.org/10.1177/0198742920982587</A></P>
<P>Rasiel, E. M. (1999). The McKinsey Way. McGraw-Hill.</P>
<P>Ryan, R. M., & Deci, E. L. (2000). Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemporary Educational Psychology, 25(1), 54-67.</P>Thu, 19 Jun 2025 07:00:00 GMThttps://techcommunity.microsoft.com/t5/educator-developer-blog/engaging-employees-a-journey-through-data-analytics/ba-p/4421284ashkan_allahyari2025-06-19T07:00:00ZWhat is the best disk partition software for Windows 11/10 now?
https://techcommunity.microsoft.com/t5/windows-11/what-is-the-best-disk-partition-software-for-windows-11-10-now/m-p/4425389#M27159
<P>Hi everyone,</P><P>I'm currently managing two Windows machines (a Windows 11 PC and a Windows 10 laptop) and I need to make some changes to my disk partitions. I want to resize existing partitions, possibly create new ones, and ensure that everything is done safely without data loss.</P><P>As far as I know, the built-in Disk Management tool offers only limited functionality when it comes to advanced tasks like resizing system partitions or merging partitions without data loss. That's why I'm looking for third-party <STRONG>disk partition software</STRONG> that is reliable, user-friendly, and preferably free or reasonably priced.</P><P>If you've used any of these tools, could you share your experience? Which one do you think is the best for home users like me?</P>Thu, 19 Jun 2025 06:48:52 GMThttps://techcommunity.microsoft.com/t5/windows-11/what-is-the-best-disk-partition-software-for-windows-11-10-now/m-p/4425389#M27159Kitassen2025-06-19T06:48:52ZReleased the full version of M365 connectivity test tool (MCT) in GCC environment
https://techcommunity.microsoft.com/t5/deployment-networking/released-the-full-version-of-m365-connectivity-test-tool-mct-in/m-p/4425363#M1501
<P>I am pleased to announce that we have successfully released the full version of the M365 connectivity test tool (MCT) in the GCC environment today. This investment will ensure excellent network connectivity and user experience across all clouds, including around 5 million users in GCC.</P>
<P>You can find the updated public documentation <A href="https://learn.microsoft.com/en-us/microsoft-365/enterprise/office-365-network-mac-perf-onboarding-tool" target="_blank">here</A>; the table below shows the feature comparison across all clouds.</P>
<P>Please let us know if you have any questions. Thanks!</P>
<P> </P>
<img />
<P> </P>Thu, 19 Jun 2025 02:23:01 GMThttps://techcommunity.microsoft.com/t5/deployment-networking/released-the-full-version-of-m365-connectivity-test-tool-mct-in/m-p/4425363#M1501Ares Chen2025-06-19T02:23:01ZPurview Destruction of Records Not Working
https://techcommunity.microsoft.com/t5/microsoft-purview/purview-destruction-of-records-not-working/m-p/4425347#M2009
<P>Hi everyone, </P><P>I work for a Microsoft Partner Organization, and we are experiencing issues with our Purview implementations: records are not being <STRONG>destroyed </STRONG>from SharePoint as expected, even though the <STRONG>audit log</STRONG> registered all successful approvals for destruction, including details on the stages, the comments, and the label. We have waited 15 days as per Purview's documentation and the content is still in the source library. Is anyone else experiencing a similar issue? The <STRONG>instability </STRONG>of the platform is forcing us to stop offering Purview as a solution to our customers, and we are leaning more toward developing our own records management solution. </P><P>We have created a ticket, but Microsoft has not responded; there seems to be no priority to resolve this issue, and unfortunately our customers can't wait. </P><P>If anyone has a solution, please share it with the community. If there are any Purview experts from Microsoft that can offer any ideas, I would appreciate it. </P><P><STRONG>Note</STRONG>: Auditing is on for the organization, and the email-enabled security group and the Disposition Reviewers are in the correct roles: Disposition Reviewer, Records Management, Compliance Admin, List View Explorer and Content Explorer. Also, my configuration is set correctly; I have proof, as the audit log indicates content was approved for destruction weeks ago. </P><P>Thank you! </P>Thu, 19 Jun 2025 00:08:59 GMThttps://techcommunity.microsoft.com/t5/microsoft-purview/purview-destruction-of-records-not-working/m-p/4425347#M2009Nancy14152025-06-19T00:08:59ZHow to chart this?
https://techcommunity.microsoft.com/t5/excel/how-to-chart-this/m-p/4425339#M252613
<P>Need help on how to calculate then chart this. I have a number of pairs, which I’m imagining as a flow, but with some loops back, and branches:<BR />From To<BR />A G<BR />G C<BR />C D<BR />C A<BR />G F<BR />B E<BR />E F<BR />F E<BR />F D</P><img /><P>I’d like it to figure out a table/chart (but with arrows) like the attached image. It may have optional paths. Doesn't have to be like a flow chart, if there's another way for excel to analyze it. I don't *think* this is a complex b-tree sort of problem...<BR /><BR />TY in advance.</P><P> </P>Wed, 18 Jun 2025 23:11:08 GMThttps://techcommunity.microsoft.com/t5/excel/how-to-chart-this/m-p/4425339#M252613hovardbehunt2025-06-18T23:11:08ZOn-device AI and security: What really matters for the enterprise
https://techcommunity.microsoft.com/t5/surface-it-pro-blog/on-device-ai-and-security-what-really-matters-for-the-enterprise/ba-p/4424458
<P>AI is evolving, and so is the way businesses run it. Traditionally, most AI workloads have been processed in the cloud. When a user gives an AI tool a prompt, that input is sent over the internet to remote servers, where the model processes it and sends back a result. This model supports large-scale services like Microsoft 365 Copilot, which integrates AI into apps like Word, Excel, and Teams.</P>
<P>Now, a new capability is emerging alongside cloud-based AI. AI can also run directly on a PC—no internet connection or remote server required. This is known as on-device processing. It means the data and the model stay on the device itself, and the work is done locally.</P>
<P>Modern CPUs and GPUs are beginning to support this kind of processing. But neural processing units (NPUs), now included in enterprise-grade PCs such as Microsoft Surface Copilot+ PCs, are specifically designed to run AI workloads efficiently. NPUs are designed to perform the types of operations AI needs at high speed while using less power. That makes them ideal for features that need to work instantly, in a sustained fashion in the background, or without an internet connection.</P>
<H3>A flexible approach to AI deployment</H3>
<P>NPUs can enable power-efficient on-device processing, fast response times with small models, consistent functionality in offline scenarios, and more control over how data is processed and stored. For organizations, it adds flexibility in choosing how and where to run AI—whether to support real-time interactions at the edge or meet specific data governance requirements.</P>
<P>At the same time, cloud-based AI remains essential to how organizations deliver intelligent services across teams and workflows. Microsoft 365 Copilot, for example, is powered by cloud infrastructure and integrates deeply across productivity applications using enterprise-grade identity, access, and content protections.</P>
<P>Both models serve different but complementary needs. On-device AI adds new options for responsiveness and control. Cloud-based AI enables broad integration and centralized scale. Together, they give businesses flexibility to align AI processing with the demands of the use case, whether for fast local inference or connected collaboration.</P>
<P>For business and IT leaders, the question is not which model is better but how to use each effectively within a secure architecture. That starts with understanding where data flows, how it is protected, and what matters most at the endpoint.</P>
<H3>Understanding AI data flow and its security impact</H3>
<P>AI systems rely on several types of input such as user prompts, system context, and business content. When AI runs in the cloud, data is transmitted to remote servers for processing. When it runs on the device, processing happens locally. Both approaches have implications for security.</P>
<P>With cloud AI, protection depends on the strength of the vendor’s infrastructure, encryption standards, and access controls. Security follows a shared responsibility model where the cloud provider secures the platform while the enterprise defines its policies for data access, classification, and compliance.</P>
<H3>Microsoft’s approach to data security and privacy in cloud AI services</H3>
<P>Although the purpose of this blog post is to talk about on-device AI and security, it’s worth a detour to briefly touch on how Microsoft approaches data governance across its cloud-based AI services. Ultimately, the goal is for employees to be able to use whatever tools work best for what they want to get done, and they may not differentiate between local and cloud AI services. That means having a trusted provider for both is important for long-term AI value and security in the organization.</P>
<P>Microsoft’s generative AI solutions, including Azure OpenAI Service and Copilot services and capabilities, do not use your organization’s data to train foundation models without your permission. The Azure OpenAI Service is operated by Microsoft as an Azure service; Microsoft hosts the OpenAI models in Microsoft's Azure environment and the Service does not interact with any services operated by OpenAI (e.g. ChatGPT, or the OpenAI API). Microsoft 365 Copilot and other AI tools operate within a secured boundary, pulling from organization-specific content sources like OneDrive and Microsoft Graph while respecting existing access permissions. For more resources on data privacy and security in Microsoft cloud AI services, check out <A href="https://learn.microsoft.com/search/?terms=data%20privacy%20and%20security%20for%20AI&category=Documentation" target="_blank">Microsoft Learn</A>.</P>
<H3>Local AI security depends on a trusted endpoint</H3>
<P>When AI runs on the device, the data stays closer to its source. This reduces reliance on network connectivity and can help limit exposure in scenarios where data residency or confidentiality is a concern. But it also means the device must be secured at every level.</P>
<P>Running AI on the device does not inherently make it more or less secure. It shifts the security perimeter. Now the integrity of the endpoint matters even more. Surface Copilot+ PCs are built with this in mind. As secured-core PCs, they integrate hardware-based protections that help guard against firmware, OS-level, and identity-based threats.</P>
<UL>
<LI>TPM 2.0 and Microsoft Pluton security processors provide hardware-based protection for sensitive data</LI>
<LI>Hardware-based root of trust verifies system integrity from boot-up</LI>
<LI>Microsoft-developed firmware can reduce exposure to third-party supply chain risks and helps address emerging threats rapidly via Windows Update</LI>
<LI>Windows Hello and Enhanced Sign-in Security (ESS) offer strong authentication at the hardware level</LI>
</UL>
<P>These protections and others work together to create a dependable foundation for local AI workloads. When AI runs on a device like this, the same enterprise-grade security stack that protects the OS and applications also applies to AI processing.</P>
<H5><STRONG>Why application design is part of the security equation</STRONG></H5>
<P>Protecting the device is foundational—but it’s not the whole story. As organizations begin to adopt generative AI tools that run locally, the security conversation must also expand to include how those tools are designed, governed, and managed.</P>
<P>The value of AI increases dramatically when it can work with rich, contextual data. But that same access introduces new risks if not handled properly. Local AI tools must be built with clear boundaries around what data they can access, how that access is granted, and how users and IT teams can control it. This includes opt-in mechanisms, permission models, and visibility into what’s being stored and why.</P>
<P>Microsoft Recall (preview) on Copilot+ PCs is a case study in how thoughtful application design can make local AI both powerful and privacy conscious. It captures snapshots of the desktop embedded with contextual information, enabling employees to find almost anything that has appeared on their screen by describing it in their own words. This functionality is only possible because Recall has access to a wide range of on-device data—but that access is carefully managed.</P>
<P>Recall runs entirely on the device. It is turned off by default—even when enabled by IT—and requires biometric sign-in with Windows Hello Enhanced Sign-in Security to activate. Snapshots are encrypted and stored locally, protected by Secured-core PC features and the Microsoft Pluton security processor. These safeguards ensure that sensitive data stays protected, even as AI becomes more deeply embedded in everyday workflows.</P>
<P>IT admins can manage Recall through Microsoft Intune, with policies to enable or disable the feature, control snapshot retention, and apply content filters. Even when Recall is enabled, it remains optional for employees, who can pause snapshot saving, filter specific apps or websites, and delete snapshots at any time.</P>
<P>This layered approach—secure hardware, secure OS, and secure app design—reflects Microsoft’s broader strategy for responsible local AI and aligns to the overall Surface security approach. It helps organizations maintain governance and compliance while giving users confidence that they are in control of their data and that the tools are designed to support them, not surveil them. This balance is essential to building trust in AI-powered workflows and ensuring that innovation doesn’t come at the expense of privacy or transparency. For more information, <A href="https://blogs.windows.com/windowsexperience/2024/09/27/update-on-recall-security-and-privacy-architecture/" target="_blank">check out the related blog post</A>.</P>
<H3>Choosing the right AI model for the use case</H3>
<P>Local AI processing complements cloud AI, offering additional options for how and where workloads run. Each approach supports different needs and use cases. What matters is selecting the right model for the task while maintaining consistent security and governance across the entire environment.</P>
<P>On-device AI is especially useful in scenarios where organizations need to reduce data movement or ensure AI works reliably in disconnected environments:</P>
<UL>
<LI>In regulated industries such as finance, legal, or government, local processing can help support compliance with strict data-handling requirements</LI>
<LI>In the field, mobile workers can use AI features such as document analysis or image recognition without relying on a stable connection</LI>
<LI>For custom enterprise models, on-device execution through the Windows AI Foundry Local lets developers embed AI in apps while maintaining control over how data is used and stored</LI>
</UL>
<P>These use cases reflect a broader trend. Businesses want more flexibility in how they deploy and manage AI. On-device processing makes that possible without requiring a tradeoff in security or integration.</P>
<H2>Security fundamentals matter most</H2>
<P>Microsoft takes a holistic view of AI security across cloud services, on-device platforms, and everything in between. Whether your AI runs in Azure or on a Surface device, the same principles apply. Protect identity, encrypt data, enforce access controls, and ensure transparency.</P>
<P>This approach builds on the enterprise-grade protections already established across Microsoft’s technology stack. From the Secure Development Lifecycle to Zero Trust access policies, Microsoft applies rigorous standards to every layer of AI deployment.</P>
<P>For business leaders, AI security extends familiar principles—identity, access, data protection—into new AI-powered workflows, with clear visibility and control over how data is handled across cloud and device environments.</P>
<H3>Securing AI starts with the right foundations</H3>
<P>AI is expanding from cloud-only services to include new capable endpoints. This shift gives businesses more ways to match the processing model to the use case without compromising security.</P>
<P>Surface Copilot+ PCs support this flexibility by delivering local AI performance on a security-forward enterprise-ready platform. When paired with Microsoft 365 and Azure services, they offer a cohesive ecosystem that respects data boundaries and aligns with organizational policies.</P>
<P>AI security is not about choosing between cloud or device. It is about enabling a flexible, secure ecosystem where AI can run where it delivers the most value—on the endpoint, in the cloud, or across both. This adaptability unlocks new ways to work, automate, and innovate, without increasing risk. Surface Copilot+ PCs are part of that broader strategy, helping organizations deploy AI with confidence and control—at scale, at speed, and at the edge of what’s next.</P>Wed, 18 Jun 2025 21:21:16 GMThttps://techcommunity.microsoft.com/t5/surface-it-pro-blog/on-device-ai-and-security-what-really-matters-for-the-enterprise/ba-p/4424458kdhillon2025-06-18T21:21:16ZCount U slots
https://techcommunity.microsoft.com/t5/excel/count-u-slots/m-p/4425323#M252610
<P>Dear Experts, </P><P> I have a case like below:-</P><P>So, there's a Worksheet "Calc" below where we have Slots from 0~19 two times from Col (C~V) and (X~AQ);</P><img /><P>In Sheet2, we have the logic on how to count the U slots:-</P><P>1D --> 1 symbol DL<BR />2DD --> 2 symbols DL<BR />2DU/2UD --> 1 symbol for UL, 1 symbol for DL<BR />1U --> 1 symbol UL<BR />2UU --> 2 symbols UL<BR />1S (for split) --> Half symbol for UL, Half symbol for DL</P><P>Can you please provide a formula for Column AS and AT to count the U slots, using the above logic?</P><img /><P> </P><P>Attached is the Worksheet.</P>Wed, 18 Jun 2025 21:08:48 GMThttps://techcommunity.microsoft.com/t5/excel/count-u-slots/m-p/4425323#M252610anupambit17972025-06-18T21:08:48ZExploring the Extensibility of ADMS Portal Customizations
https://techcommunity.microsoft.com/t5/microsoft-security-community/exploring-the-extensibility-of-adms-portal-customizations/ba-p/4425257
<H5><STRONG>A Comprehensive Overview</STRONG></H5>
<P class="lia-align-justify">The ADMS Portal is more than just a migration interface—it's a customizable, intelligent platform designed to streamline and enhance the migration experience for both users and IT administrators.</P>
<P>ADMS, ADSS, and ADGMS are all cloud-based services within the ADxS services portfolio offered by Microsoft, designed to facilitate efficient and cost-effective migrations. For additional information about migration use cases, refer to this blog: <A class="lia-internal-link lia-internal-url lia-internal-url-content-type-blog" href="https://techcommunity.microsoft.com/blog/microsoft-security-blog/exploring-the-use-cases-of-adxs-services/4373299?previewMessage=true" data-lia-auto-title="Exploring the Use Cases of ADxS Services | Microsoft Community Hub" data-lia-auto-title-active="0" target="_blank">Exploring the Use Cases of ADxS Services | Microsoft Community Hub</A> </P>
<P class="lia-align-justify">ADMS, or Active Directory Migration Service, is a service designed to facilitate the migration of users and workstations across domains and forests. It offers a diverse set of migration methods: Self-Service Migration, which is unique to ADMS and comes in two variants (Self-Service for corporate-connected users and Self-Service for remote or VPN users); admin-automated migrations; user-only migration; and migration for workstations shared by more than one user. </P>
<H5><STRONG>Prerequisites for a User Migration</STRONG></H5>
<P class="lia-align-justify">Users must be in scope for the ADMS sync engine, meet all identity logic, and be in the migration database prior to coming to the ADMS Portal. One of the first steps we perform is to pre-provision or join source identities to target identities, while also working with your team to determine which attributes flow as part of the sync engine.</P>
<P class="lia-align-justify">Before any migration begins, the ADMS Portal submits each user to a standardized set of preflight validations designed to catch common issues that could disrupt the process. These checks are essential safeguards that ensure a smooth and secure migration from the source to the target environment.</P>
<P>Refer to this blog for more details: <A class="lia-internal-link lia-internal-url lia-internal-url-content-type-blog" href="https://techcommunity.microsoft.com/blog/microsoft-security-blog/ensuring-smooth-migrations-with-adms-portal%E2%80%99s-preflight-checks/4421401" target="_blank" rel="noopener" data-lia-auto-title="Ensuring Smooth Migrations with ADMS Portal’s Preflight Checks | Microsoft Community Hub" data-lia-auto-title-active="0">Ensuring Smooth Migrations with ADMS Portal’s Preflight Checks | Microsoft Community Hub</A></P>
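<P>Conceptually, a preflight gate like this chains independent validations and blocks migration on any failure. The sketch below is hypothetical: the check names and user fields are illustrative placeholders, not the actual ADMS check set (see the linked blog for those):</P>

```python
def check_in_sync_scope(user):
    # The user must already be in scope for the ADMS sync engine
    return user.get("in_sync_scope", False)

def check_target_provisioned(user):
    # A pre-provisioned/joined target identity must exist
    return user.get("target_identity") is not None

PREFLIGHT_CHECKS = [
    ("In ADMS sync scope", check_in_sync_scope),
    ("Target identity provisioned", check_target_provisioned),
]

def run_preflight(user):
    """Return names of failed checks; an empty list means the user may migrate."""
    return [name for name, check in PREFLIGHT_CHECKS if not check(user)]
```

<P>A user passing every check proceeds to migration; any failure is reported back before the migration is allowed to start.</P>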
<H5><STRONG>User Migration journey</STRONG></H5>
<P>Assuming the user meets the preflight checks for migration, the user is submitted to the activation phase. In this phase, the user object is enabled if necessary and submitted to the ADMS AR Pipeline. </P>
<P class="lia-align-justify">The default delivery copies the objectSID of the source user to the target user's SIDHistory at user migration run-time in the ADMS AR pipeline. We also submit the user for any additional application/service remediation agreed upon during workshops.</P>
<P>Refer to this blog for more details: <A class="lia-internal-link lia-internal-url lia-internal-url-content-type-blog" href="https://techcommunity.microsoft.com/blog/microsoft-security-blog/exploring-the-use-cases-of-adms-user-migration/4410358" data-lia-auto-title="Exploring the Use Cases of ADMS User Migration | Microsoft Community Hub" data-lia-auto-title-active="0" target="_blank">Exploring the Use Cases of ADMS User Migration | Microsoft Community Hub</A> </P>
<P> </P>
<P><STRONG>Here’s a look at what the ADMS Portal can customize for a user migration:</STRONG></P>
<UL>
<LI class="lia-align-justify"> <STRONG>Preflight checks</STRONG>: ADMS team enables a standardized set of preflight validations designed to catch common issues that could disrupt the process. These checks are not just technical formalities—they are essential safeguards that ensure each migration proceeds smoothly and securely from old source environment to new target environment.</LI>
<LI class="lia-align-justify"><STRONG>Portal Landing Page</STRONG>: The ADMS Portal landing page can be configured to let the user choose from one or more connection options. This includes, but is not limited to, connecting from a remote location over VPN.</LI>
<LI class="lia-align-justify"><STRONG>Multi-language support</STRONG>: The ADMS Portal can be configured to allow the user's local language to be displayed in their browser providing a richer user experience for the various use cases brought to the migration portal.</LI>
<LI class="lia-align-justify"> <STRONG>Customer Support Contact</STRONG>: ADMS team will configure the ADMS Portal to display the customer support contact information to help the user experience a better escalation path if any issues do occur during their migration journey.</LI>
<LI class="lia-align-justify"><STRONG>Identity Enablement</STRONG>: ADMS team has the ability to enable target user identities as part of the staging queue process during user migration.</LI>
<LI class="lia-align-justify"><STRONG>Identity Sync Engine:</STRONG> Conventional tools synchronize Active Directory objects as-is to the target domain and refresh them as changes are made in the source. ADMS implements a rich and robust identity management system so that just the right identities, groups, group memberships and workstations are synchronized and provisioned and will continuously run until the migration has been completed to accommodate changes in the source.</LI>
<LI class="lia-align-justify"><STRONG>ADMS AR Pipeline</STRONG>: The ADMS delivery team can handle at run-time remediation in the ADMS AR pipeline. This is done per user at user migration run-time to allow coexistence, maintaining access for those pending migration, and updating access for those that have performed migration through the ADMS Portal. Refer to this blog for more details: <A class="lia-internal-link lia-internal-url lia-internal-url-content-type-blog" href="https://techcommunity.microsoft.com/blog/microsoft-security-blog/exploring-the-use-cases-of-adms-application-pipeline/4404097" data-lia-auto-title="Exploring the Use Cases of ADMS Application Pipeline | Microsoft Community Hub" data-lia-auto-title-active="0" target="_blank">Exploring the Use Cases of ADMS Application Pipeline | Microsoft Community Hub</A> </LI>
<LI class="lia-align-justify"><STRONG>Feature Enablement</STRONG>: ADMS delivery includes the ability to enable the SIDHistory feature at user migration run-time, and can also include our one-way password sync feature from source to target.</LI>
<LI class="lia-align-justify"><STRONG>Post User Migration</STRONG>: The ADMS team can disable source user identities after an agreed-upon grace period following user migration.</LI>
<LI class="lia-align-justify"><STRONG>Custom Preflight Check</STRONG>: The ADMS team can add custom preflight checks for some migration use cases.</LI>
</UL>
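The custom preflight checks described above boil down to a gate: every check must pass before a user becomes eligible to migrate. The sketch below is purely illustrative of that pattern; the check names, input fields, and result structure are assumptions for demonstration and are not part of the ADMS API.

```python
# Illustrative sketch of a preflight-check gate for user migration.
# All names and fields here are hypothetical, not the ADMS implementation.

def check_target_identity_provisioned(user):
    # A real check would query the target domain; here we inspect a flag.
    return user.get("target_identity_provisioned", False)

def check_not_already_migrated(user):
    return user.get("status") != "migrated"

PREFLIGHT_CHECKS = [
    ("target identity provisioned", check_target_identity_provisioned),
    ("not already migrated", check_not_already_migrated),
]

def run_preflight(user):
    """Run every check; return (eligible, names of failed checks)."""
    failures = [name for name, check in PREFLIGHT_CHECKS if not check(user)]
    return (not failures, failures)
```

A custom check is then just one more `(name, function)` pair appended to the list, which mirrors how per-customer preflight checks can be layered on top of the defaults.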
<H5><STRONG>Device Migration journey</STRONG></H5>
<P class="lia-align-justify">Conventional tools require mapping users to workstations so that the migration sequence can be structured and run by the migration team. ADMS offers a simple-to-use portal service so that self-service migrations can be offered, with users able to migrate when it's convenient for them. </P>
<P>Refer to this blog for more details: <A class="lia-internal-link lia-internal-url lia-internal-url-content-type-blog" href="https://techcommunity.microsoft.com/blog/microsoft-security-blog/exploring-the-extensibility-of-active-directory-migration-service-adms-device-mi/4397075" data-lia-auto-title="Exploring the Extensibility of Active Directory Migration Service (ADMS) Device Migration | Microsoft Community Hub" data-lia-auto-title-active="0" target="_blank">Exploring the Extensibility of Active Directory Migration Service (ADMS) Device Migration | Microsoft Community Hub</A> </P>
<P> </P>
<P><STRONG>Here’s a look at what the ADMS Portal can customize for a device migration:</STRONG></P>
<UL>
<LI class="lia-align-justify"><STRONG>Preflight checks</STRONG>: The ADMS WMT service must be present for a device to be eligible for migration; this check is enabled by default. </LI>
<LI class="lia-align-justify"><STRONG>Identity Sync Engine</STRONG>: Conventional tools synchronize Active Directory objects as-is to the target domain and refresh them as changes are made in the source. ADMS implements a rich and robust identity management system that synchronizes and provisions only the required identities, groups, group memberships, and workstations, and runs continuously until the migration is complete to accommodate changes in the source.</LI>
<LI class="lia-align-justify"><STRONG>WMT Service</STRONG>: The ADMS WMT service conducts workstation migration operations. It runs at device migration runtime when invoked by the ADMS Portal or our auto migration app, performing our default migration operations as well as any additional features agreed upon during the design discussions for your ADMS delivery.</LI>
<LI class="lia-align-justify"><STRONG>WMT Service Custom External Scripts</STRONG>: The ADMS WMT service can run custom external PowerShell scripts at various execution points during the device migration sub-steps, which has been a game changer for our ADMS customers. There will be more on this in a future blog post. </LI>
<LI class="lia-align-justify"><STRONG>Approved to Migrate Computer Check</STRONG>: This is an optional check that can be enabled to look for a registry key on the device.</LI>
<LI class="lia-align-justify"><STRONG>Remote/VPN IP Range Check</STRONG>: The ADMS Portal can use an IP range provided by the customer to determine whether a client is attempting migration from the corporate network or a remote connection. </LI>
<LI class="lia-align-justify"><STRONG>AutoMigApp - Device Only</STRONG>: The ADMS team can generate an auto migration app package designed to perform a device-only migration. </LI>
<LI class="lia-align-justify"><STRONG>Custom Preflight Check</STRONG>: The ADMS team can add custom preflight checks for some migration use cases.</LI>
</UL>
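As one concrete illustration, the Remote/VPN IP range check above amounts to testing whether the client's address falls inside one of the customer-provided corporate ranges. A minimal sketch, assuming the ranges arrive as CIDR strings; the function name, the sample ranges, and the input format are hypothetical, not the ADMS implementation:

```python
import ipaddress

# Hypothetical corporate CIDR ranges supplied by the customer.
# These example values are placeholders, not real ADMS configuration.
CORPORATE_RANGES = [
    ipaddress.ip_network(cidr)
    for cidr in ("10.0.0.0/8", "192.168.100.0/24")
]

def is_corporate_client(client_ip: str) -> bool:
    """Return True if the client address is inside any corporate range,
    i.e. the device appears to be on the corporate network rather than
    a remote/VPN connection."""
    addr = ipaddress.ip_address(client_ip)
    return any(addr in net for net in CORPORATE_RANGES)
```

The same membership test works unchanged for IPv6 ranges, since `ipaddress` handles both address families behind a common interface.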
<H5><STRONG>ADMS Portal Benefits</STRONG></H5>
<P>The ADMS Portal offers a robust and flexible platform that enhances the migration experience for both users and administrators. Here are the key benefits:</P>
<UL>
<LI class="lia-align-justify"><STRONG>Streamlined User Experience</STRONG>: With customizable landing pages, multilingual support, and integrated customer support contact information, the portal ensures a smooth and intuitive experience for end users.</LI>
<LI class="lia-align-justify"><STRONG>Comprehensive Preflight Checks</STRONG>: Built-in and customizable preflight validations help identify and resolve potential issues before migration begins, reducing downtime and ensuring a higher success rate.</LI>
<LI class="lia-align-justify"><STRONG>Flexible Identity Management</STRONG>: The Identity Sync Engine and Identity Enablement features allow for precise control over which users, groups, and devices are migrated, ensuring alignment with organizational policies.</LI>
<LI class="lia-align-justify"><STRONG>Real-Time Remediation</STRONG>: The ADMS AR Pipeline supports runtime remediation, enabling coexistence and seamless access transitions during the migration process. </LI>
<LI class="lia-align-justify"><STRONG>Advanced Device Migration Support</STRONG>: The WMT service and AutoMigApp provide powerful tools for device-only migrations, including support for custom scripts and remote/VPN IP range checks.</LI>
<LI class="lia-align-justify"><STRONG>Post-Migration Controls</STRONG>: The service supports post-migration actions such as disabling source identities after a grace period, helping maintain security and compliance.</LI>
<LI class="lia-align-justify"><STRONG>Extensibility and Customization</STRONG>: From custom preflight checks to external script execution, the portal is designed to adapt to unique migration scenarios and enterprise needs.</LI>
</UL>
<H5><STRONG>Conclusion</STRONG></H5>
<P>ADMS is a service designed to facilitate the migration of users and workstations across domains and forests by offering a diverse set of migration methods. The service not only simplifies the migration process but also ensures that organizations can achieve their migration goals more efficiently and cost-effectively.</P>
<P> </P>
<P><STRONG>Learn more about IMS and explore its powerful migration capabilities today!</STRONG></P>
<UL>
<LI>Read our latest insights on the <A href="https://techcommunity.microsoft.com/tag/identity%20migration%20service?nodeId=board%3Amicrosoft-security-blog" target="_blank" rel="noopener">IMS blog</A> </LI>
<LI>See IMS in action and start hassle-free migrations today on our <A href="https://www.youtube.com/channel/UCrkUBYVf1l_7oVdQNMtoJew" target="_blank" rel="noopener">YouTube Channel</A> </LI>
<LI>Want to speak with an expert? Reach out to us at <A href="mailto:[email protected]" target="_blank" rel="noopener">[email protected]</A> to connect with a sales representative. Let’s power the future of digital collaboration — together.</LI>
</UL>Wed, 18 Jun 2025 20:16:38 GMThttps://techcommunity.microsoft.com/t5/microsoft-security-community/exploring-the-extensibility-of-adms-portal-customizations/ba-p/4425257jasoncox2025-06-18T20:16:38ZJoin us for our Community Call TOMORROW!
https://techcommunity.microsoft.com/t5/connect-and-ask-questions/join-us-for-our-community-call-tomorrow/m-p/4425314#M24
<P><STRONG>Don't miss tomorrow's community call! </STRONG></P>
<P>We will walk through our new MSLE community and demonstrate how to use the new Learning Download Center.</P>
<P><STRONG>Date: </STRONG>Thursday, June 19, 2025</P>
<P><STRONG>Time: </STRONG>10:00 - 11:00 AM PDT (Pacific Daylight Time) </P>
<P><STRONG>Agenda: </STRONG>The new MSLE Community </P>
<P> </P>
<P>From the <A class="lia-external-url" href="https://aka.ms/MSLECommunity" target="_blank">MSLE Community overview page</A> select "<STRONG>Events</STRONG>" and then select "<STRONG>Microsoft Learn for Educator Events</STRONG>". Select "<STRONG>June MSLE Community Call</STRONG>" and from there you can add the event to your calendar. Select "Attending" or "Interested" to receive notifications and updates regarding the event.</P>
<P> </P>
<P><STRONG><U>What are MSLE Community Calls?</U></STRONG> </P>
<P>Each MSLE Community Call provides a unique experience to learn about topics that impact you or your students and highlights the amazing work of the MSLE Community. You'll hear from rotating presenters including members of the MSLE Program Team, other groups at Microsoft, and fellow MSLE educators. All Community Calls are delivered in English with live captions available. All sessions are recorded. </P>
<P> </P>Wed, 18 Jun 2025 19:34:38 GMThttps://techcommunity.microsoft.com/t5/connect-and-ask-questions/join-us-for-our-community-call-tomorrow/m-p/4425314#M24RobinLBaldwin2025-06-18T19:34:38ZLogs not available for PDF applied with sensitivity label
https://techcommunity.microsoft.com/t5/microsoft-purview/logs-not-available-for-pdf-applied-with-sensitivity-label/m-p/4425312#M2008
<P>We created sensitivity labels for files and can apply them to files (docx, xlsx, pdf). However, we found that there were no activity logs for PDF files, either in Activity Explorer or in audit search. Activity logs were available for MS Office documents (docx, xlsx). Is there any way we can enable logging for PDF documents with labelled content?</P><P>Thanks</P>Wed, 18 Jun 2025 19:28:16 GMThttps://techcommunity.microsoft.com/t5/microsoft-purview/logs-not-available-for-pdf-applied-with-sensitivity-label/m-p/4425312#M2008JamesY6502025-06-18T19:28:16ZLockheed Martin & Librestream boost mission readiness with secure video collaboration platform
https://techcommunity.microsoft.com/t5/azure-communication-services/lockheed-martin-librestream-boost-mission-readiness-with-secure/ba-p/4425309
<P><EM>The following is a public press release from Lockheed Martin & Librestream announcing the launch of their secure video collaboration platform.<BR /><BR /></EM>In an effort to continue ensuring armed forces are ready for any mission, <A class="lia-external-url" href="https://www.lockheedmartin.com/" target="_blank">Lockheed Martin</A> and Librestream have partnered with Microsoft to revolutionize defense sustainment with the introduction of Onsight NOW, an advanced remote collaboration platform.</P>
<P>Onsight NOW enables seamless communication, enhances teamwork and boosts productivity through its advanced features, including chat, annotations and multi-user calling capabilities. The tool operates on Microsoft's Azure Government Cloud, which provides a secure and reliable method for defense operations teams to collaborate in real-time, enabling them to share live video feeds, annotate critical details, and capture vital information – all while maintaining the highest levels of security and compliance.</P>
<P><A class="lia-external-url" href="https://learn.microsoft.com/en-us/azure/azure-government/documentation-government-welcome" target="_blank" rel="noopener">Azure Government services</A> provide a strong and secure foundation for cloud computing that meets the high standards of defense agencies. This includes being certified for FedRAMP, which is a strict set of federal security guidelines.</P>
<P>“Lockheed Martin’s implementation of Onsight NOW on Azure Government is a testament to its unwavering commitment to pioneering secure, cutting-edge solutions in defense sustainment. By leveraging Onsight NOW, Lockheed Martin is not only advancing operational efficiency but also setting a new benchmark for secure, interoperable video collaboration across the defense sector,” said Dan Flynn, Managing Director, Global Defense at Librestream.</P>
<P>Onsight NOW on Azure Government is installed securely within Lockheed Martin’s own cloud environment, which ensures sensitive information is protected and handled correctly. This permits both Lockheed Martin and external partners to securely collaborate in real time.</P>
<P>“Aligned with 1LMX – our mission-driven business and digital transformation program – Onsight NOW will ultimately deliver the speed, agility and insights our customers need to be ready to address growing security threats across the world,” says Drew Robbins, Vice President of Sustainment Operations at Lockheed Martin. “By facilitating timely maintenance, enhancing collaboration, reducing costs and ensuring compliance, Onsight NOW will ultimately improve readiness for critical missions, ensuring air dominance for America and its allies.”</P>
<P>“We are thrilled to partner with Lockheed Martin and Librestream to launch Onsight NOW for Azure GOV,” said Bob Serr, VP of Engineering at Azure Communication Services.</P>
<P>“This collaboration exemplifies our commitment to providing secure and innovative solutions that empower defense agencies to maintain mission-critical assets with the highest standards of security and compliance.”</P>Wed, 18 Jun 2025 19:27:06 GMThttps://techcommunity.microsoft.com/t5/azure-communication-services/lockheed-martin-librestream-boost-mission-readiness-with-secure/ba-p/4425309kellymoon2025-06-18T19:27:06Z